[AppEx AI] Integrate pre-configured Kibana AI Connector to the O11y and Search GenAI powered functionality #202626
Labels
8.18 candidate
Team:Observability
v8.18.0
Currently, the solutions' AI Assistants use three different GenAI connector types, which integrate directly with OpenAI, Gemini, and Bedrock.
The long-term vision is to migrate from the explicit connector implementations to the generic `.inference` connector type (#189027), which is integrated with the Elasticsearch Inference API. This connector supports multiple LLM integrations (including OpenAI, Gemini, Bedrock, and more), but due to the large scope of the complete adoption and migration path to the `.inference` connector, the scope was reduced to an MVP integration: using a pre-configured connector with the EIS service. In 8.18 there is a plan to provide an Elastic Default LLM experience, which will be exposed within the Kibana `.inference` connector type.

Requirements:
- Add the `.inference` pre-configured connector instance to the list of available connector instances and select/use it by default (if the connector selection was not changed).
- Support `.inference` connector instances.
- Use the `_unified` completion introduced by [AI Connector] Kibana inference connector should support openai compatible schema for completion task #202621
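For illustration, a pre-configured Kibana connector is normally declared in `kibana.yml` under `xpack.actions.preconfigured`. The sketch below shows what a pre-configured `.inference` connector could look like; the connector id, display name, and the `config` fields (`provider`, `taskType`, `inferenceId`) are assumptions for illustration, not the actual EIS defaults shipped in 8.18:

```yaml
# kibana.yml — illustrative sketch, not the shipped EIS configuration.
xpack.actions.preconfigured:
  elastic-llm:                      # hypothetical connector id
    name: Elastic Default LLM       # hypothetical display name
    actionTypeId: .inference
    config:
      provider: elastic             # assumed provider key for the EIS service
      taskType: chat_completion     # assumed task type used for completions
      inferenceId: .default-llm     # assumed EIS inference endpoint id
```

Because the connector is pre-configured, it would appear in the connector list for all spaces without any secrets stored in Kibana saved objects, which is what lets the solutions select it by default.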
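As a sketch of the `_unified` completion path: the Elasticsearch Inference API exposes an OpenAI-compatible chat completion request shape (an array of `messages` with `role`/`content`). The inference endpoint id below (`.default-llm`) is a placeholder assumption:

```
POST _inference/chat_completion/.default-llm/_stream
{
  "messages": [
    { "role": "user", "content": "Summarize the last hour of error logs." }
  ]
}
```

The point of #202621 is that the `.inference` connector can pass this OpenAI-compatible schema through, so the AI Assistants do not need provider-specific request mapping for the default LLM.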