[8.x] [inference] Add support for inference connectors (#204541) (#205078)

# Backport

This will backport the following commits from `main` to `8.x`:
- [[inference] Add support for inference connectors
(#204541)](#204541)

<!--- Backport version: 9.4.3 -->

### Questions?
Please refer to the [Backport tool
documentation](https://github.com/sqren/backport)

## Summary

~Depends on~ #200249 merged!

Fix https://github.com/elastic/kibana/issues/199082

- Add support for the `inference` stack connectors to the `inference` plugin (everything is inference)
- Adapt the o11y assistant to use the `inference-common` utilities for connector filtering / compat checking

## How to test

**1. Start ES with the unified completion feature flag**

```sh
yarn es snapshot --license trial ES_JAVA_OPTS="-Des.inference_unified_feature_flag_enabled=true"
```

**2. Enable the inference connector for Kibana**

In the Kibana config file:

```yaml
xpack.stack_connectors.enableExperimental: ['inferenceConnectorOn']
```

**3. Start dev Kibana**

```sh
node scripts/kibana --dev --no-base-path
```

**4. Create an inference connector**

Go to `http://localhost:5601/app/management/insightsAndAlerting/triggersActionsConnectors/connectors` and create an inference connector:

- Type: `AI connector`

then

- Service: `OpenAI`
- API Key: Gwzk... Kidding, please ping someone
- Model ID: `gpt-4o`
- Task type: `completion`

-> save

**5. Test the o11y assistant**

Use the assistant as you would for any other connector (just make sure the inference connector is selected as the one being used) and do your testing.

---------

Co-authored-by: kibanamachine <[email protected]>
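For reference, the compat checking the o11y assistant now delegates to `@kbn/inference-common` boils down to a connector-type allow-list plus a task-type check for `.inference` connectors. The following is a self-contained sketch mirroring the new `src/connectors.ts` in the diff below; it is trimmed for illustration, not the actual package code:

```typescript
// Sketch of the connector-compatibility checks added in this PR.
enum InferenceConnectorType {
  OpenAI = '.gen-ai',
  Bedrock = '.bedrock',
  Gemini = '.gemini',
  Inference = '.inference',
}

const COMPLETION_TASK_TYPE = 'completion';

interface RawConnector {
  id: string;
  actionTypeId: string;
  name: string;
  config?: Record<string, any>;
}

const allSupported = Object.values(InferenceConnectorType) as string[];

// Type-level check only: insufficient for `.inference` connectors,
// which additionally need the completion task type.
function isSupportedConnectorType(id: string): id is InferenceConnectorType {
  return allSupported.includes(id);
}

function isSupportedConnector(connector: RawConnector): boolean {
  if (!isSupportedConnectorType(connector.actionTypeId)) {
    return false;
  }
  if (connector.actionTypeId === InferenceConnectorType.Inference) {
    // `.inference` connectors are only usable when configured for completion.
    return (connector.config ?? {}).taskType === COMPLETION_TASK_TYPE;
  }
  return true;
}

// Filtering a mixed connector list, as the o11y assistant now does:
const connectors: RawConnector[] = [
  { id: '1', actionTypeId: '.gen-ai', name: 'openai' },
  { id: '2', actionTypeId: '.inference', name: 'eis', config: { taskType: 'completion' } },
  { id: '3', actionTypeId: '.inference', name: 'embed', config: { taskType: 'embeddings' } },
  { id: '4', actionTypeId: '.slack', name: 'notify' },
];
const usable = connectors.filter(isSupportedConnector).map((c) => c.id);
console.log(usable); // [ '1', '2' ]
```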

---------

Co-authored-by: Pierre Gayvallet <[email protected]>
kibanamachine and pgayvallet authored Dec 23, 2024
1 parent 9b9ce42 commit a08a128
Showing 38 changed files with 984 additions and 297 deletions.
@@ -10,7 +10,7 @@ import { css } from '@emotion/css';
import { EuiFlexGroup, EuiFlexItem, EuiSpacer, useCurrentEuiBreakpoint } from '@elastic/eui';
import type { ActionConnector } from '@kbn/triggers-actions-ui-plugin/public';
import { GenerativeAIForObservabilityConnectorFeatureId } from '@kbn/actions-plugin/common';
import { isSupportedConnectorType } from '@kbn/observability-ai-assistant-plugin/public';
import { isSupportedConnectorType } from '@kbn/inference-common';
import { AssistantBeacon } from '@kbn/ai-assistant-icon';
import type { UseKnowledgeBaseResult } from '../hooks/use_knowledge_base';
import type { UseGenAIConnectorsResult } from '../hooks/use_genai_connectors';
1 change: 1 addition & 0 deletions x-pack/packages/kbn-ai-assistant/tsconfig.json
@@ -38,6 +38,7 @@
"@kbn/ml-plugin",
"@kbn/share-plugin",
"@kbn/ai-assistant-common",
"@kbn/inference-common",
"@kbn/storybook",
"@kbn/ai-assistant-icon",
]
@@ -95,3 +95,9 @@ export {
} from './src/errors';

export { truncateList } from './src/truncate_list';
export {
InferenceConnectorType,
isSupportedConnectorType,
isSupportedConnector,
type InferenceConnector,
} from './src/connectors';
@@ -0,0 +1,91 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import {
InferenceConnectorType,
isSupportedConnectorType,
isSupportedConnector,
RawConnector,
COMPLETION_TASK_TYPE,
} from './connectors';

const createRawConnector = (parts: Partial<RawConnector>): RawConnector => {
return {
id: 'id',
actionTypeId: 'connector-type',
name: 'some connector',
config: {},
...parts,
};
};

describe('isSupportedConnectorType', () => {
it('returns true for supported connector types', () => {
expect(isSupportedConnectorType(InferenceConnectorType.OpenAI)).toBe(true);
expect(isSupportedConnectorType(InferenceConnectorType.Bedrock)).toBe(true);
expect(isSupportedConnectorType(InferenceConnectorType.Gemini)).toBe(true);
expect(isSupportedConnectorType(InferenceConnectorType.Inference)).toBe(true);
});
it('returns false for unsupported connector types', () => {
expect(isSupportedConnectorType('anything-else')).toBe(false);
});
});

describe('isSupportedConnector', () => {

it('returns true for OpenAI connectors', () => {
expect(
isSupportedConnector(createRawConnector({ actionTypeId: InferenceConnectorType.OpenAI }))
).toBe(true);
});

it('returns true for Bedrock connectors', () => {
expect(
isSupportedConnector(createRawConnector({ actionTypeId: InferenceConnectorType.Bedrock }))
).toBe(true);
});

it('returns true for Gemini connectors', () => {
expect(
isSupportedConnector(createRawConnector({ actionTypeId: InferenceConnectorType.Gemini }))
).toBe(true);
});

it('returns true for inference connectors with the right taskType', () => {
expect(
isSupportedConnector(
createRawConnector({
actionTypeId: InferenceConnectorType.Inference,
config: { taskType: COMPLETION_TASK_TYPE },
})
)
).toBe(true);
});

it('returns false for inference connectors with a bad taskType', () => {
expect(
isSupportedConnector(
createRawConnector({
actionTypeId: InferenceConnectorType.Inference,
config: { taskType: 'embeddings' },
})
)
).toBe(false);
});

it('returns false for inference connectors without taskType', () => {
expect(
isSupportedConnector(
createRawConnector({
actionTypeId: InferenceConnectorType.Inference,
config: {},
})
)
).toBe(false);
});
});
@@ -0,0 +1,76 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

/**
* The list of connector types that can be used with the inference APIs
*/
export enum InferenceConnectorType {
OpenAI = '.gen-ai',
Bedrock = '.bedrock',
Gemini = '.gemini',
Inference = '.inference',
}

export const COMPLETION_TASK_TYPE = 'completion';

const allSupportedConnectorTypes = Object.values(InferenceConnectorType);

export interface InferenceConnector {
type: InferenceConnectorType;
name: string;
connectorId: string;
}

/**
* Checks if a given connector type is compatible for inference.
*
* Note: this check is not sufficient to assert if a given connector can be
* used for inference, as `.inference` connectors need additional check logic.
* Please use `isSupportedConnector` instead when possible.
*/
export function isSupportedConnectorType(id: string): id is InferenceConnectorType {
return allSupportedConnectorTypes.includes(id as InferenceConnectorType);
}

/**
* Checks if a given connector is compatible for inference.
*
* A connector is compatible if:
* 1. its type is in the list of allowed types, and
* 2. for `.inference` connectors, its taskType is `completion`
*/
export function isSupportedConnector(connector: RawConnector): connector is RawInferenceConnector {
if (!isSupportedConnectorType(connector.actionTypeId)) {
return false;
}
if (connector.actionTypeId === InferenceConnectorType.Inference) {
const config = connector.config ?? {};
if (config.taskType !== COMPLETION_TASK_TYPE) {
return false;
}
}
return true;
}

/**
* Connector types live in the actions plugin, and this package can't afford
* a dependency on a mid-level plugin, so we define our own minimal
* connector type instead.
*/
export interface RawConnector {
id: string;
actionTypeId: string;
name: string;
config?: Record<string, any>;
}

interface RawInferenceConnector {
id: string;
actionTypeId: InferenceConnectorType;
name: string;
config?: Record<string, any>;
}
24 changes: 0 additions & 24 deletions x-pack/platform/plugins/shared/inference/common/connectors.ts

This file was deleted.

8 changes: 6 additions & 2 deletions x-pack/platform/plugins/shared/inference/common/http_apis.ts
@@ -5,8 +5,12 @@
* 2.0.
*/

import type { FunctionCallingMode, Message, ToolOptions } from '@kbn/inference-common';
import { InferenceConnector } from './connectors';
import type {
FunctionCallingMode,
Message,
ToolOptions,
InferenceConnector,
} from '@kbn/inference-common';

export type ChatCompleteRequestBody = {
connectorId: string;
3 changes: 1 addition & 2 deletions x-pack/platform/plugins/shared/inference/public/types.ts
@@ -5,8 +5,7 @@
* 2.0.
*/

import type { ChatCompleteAPI, OutputAPI } from '@kbn/inference-common';
import type { InferenceConnector } from '../common/connectors';
import type { ChatCompleteAPI, OutputAPI, InferenceConnector } from '@kbn/inference-common';

/* eslint-disable @typescript-eslint/no-empty-interface*/

@@ -25,9 +25,9 @@ import {
withoutOutputUpdateEvents,
type ToolOptions,
ChatCompleteOptions,
type InferenceConnector,
} from '@kbn/inference-common';
import type { ChatCompleteRequestBody } from '../../common/http_apis';
import type { InferenceConnector } from '../../common/connectors';
import { createOutputApi } from '../../common/output/create_output_api';
import { eventSourceStreamIntoObservable } from '../../server/util/event_source_stream_into_observable';

@@ -5,11 +5,12 @@
* 2.0.
*/

import { InferenceConnectorType } from '../../../common/connectors';
import { InferenceConnectorType } from '@kbn/inference-common';
import { getInferenceAdapter } from './get_inference_adapter';
import { openAIAdapter } from './openai';
import { geminiAdapter } from './gemini';
import { bedrockClaudeAdapter } from './bedrock';
import { inferenceAdapter } from './inference';

describe('getInferenceAdapter', () => {
it('returns the openAI adapter for OpenAI type', () => {
@@ -23,4 +24,8 @@ describe('getInferenceAdapter', () => {
it('returns the bedrock adapter for Bedrock type', () => {
expect(getInferenceAdapter(InferenceConnectorType.Bedrock)).toBe(bedrockClaudeAdapter);
});

it('returns the inference adapter for Inference type', () => {
expect(getInferenceAdapter(InferenceConnectorType.Inference)).toBe(inferenceAdapter);
});
});
@@ -5,11 +5,12 @@
* 2.0.
*/

import { InferenceConnectorType } from '../../../common/connectors';
import { InferenceConnectorType } from '@kbn/inference-common';
import type { InferenceConnectorAdapter } from '../types';
import { openAIAdapter } from './openai';
import { geminiAdapter } from './gemini';
import { bedrockClaudeAdapter } from './bedrock';
import { inferenceAdapter } from './inference';

export const getInferenceAdapter = (
connectorType: InferenceConnectorType
@@ -23,6 +24,9 @@ export const getInferenceAdapter = (

case InferenceConnectorType.Bedrock:
return bedrockClaudeAdapter;

case InferenceConnectorType.Inference:
return inferenceAdapter;
}

return undefined;
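The change to `get_inference_adapter.ts` above is a plain type-to-adapter dispatch, with the new `.inference` case added alongside the existing ones. A minimal runnable sketch of that pattern (the `Adapter` shape here is a placeholder, not the plugin's real `InferenceConnectorAdapter` interface):

```typescript
// Placeholder adapter shape; the real adapters implement the plugin's
// InferenceConnectorAdapter interface.
interface Adapter {
  name: string;
}

const openAIAdapter: Adapter = { name: 'openai' };
const geminiAdapter: Adapter = { name: 'gemini' };
const bedrockClaudeAdapter: Adapter = { name: 'bedrock' };
const inferenceAdapter: Adapter = { name: 'inference' };

// Dispatch a connector type id to its adapter; unknown types yield undefined.
function getInferenceAdapter(connectorType: string): Adapter | undefined {
  switch (connectorType) {
    case '.gen-ai':
      return openAIAdapter;
    case '.gemini':
      return geminiAdapter;
    case '.bedrock':
      return bedrockClaudeAdapter;
    case '.inference':
      return inferenceAdapter;
  }
  return undefined;
}

console.log(getInferenceAdapter('.inference')?.name); // inference
```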
@@ -0,0 +1,8 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

export { inferenceAdapter } from './inference_adapter';