[Security KB] Fix setup KB (elastic#201175)
## Summary

Fix an issue with auto-recovery of Knowledge Base setup.

When the KB setup was initialized on an undersized cluster, the model
failed to deploy correctly, leaving the KB in a broken state that
repeatedly displayed the Setup KB button.
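The recovery pattern behind the fix — force-delete any existing (possibly broken) inference endpoint before recreating it — can be sketched as follows. This is a minimal, self-contained sketch: `MockInferenceClient`, `recreateEndpoint`, and the endpoint id are illustrative stand-ins, not the plugin's actual API.

```typescript
interface Endpoint {
  inferenceId: string;
  healthy: boolean;
}

// Stand-in for the Elasticsearch inference API (illustrative only).
class MockInferenceClient {
  endpoints = new Map<string, Endpoint>();

  async put(inferenceId: string): Promise<void> {
    // Creating the endpoint deploys the model fresh.
    this.endpoints.set(inferenceId, { inferenceId, healthy: true });
  }

  async delete(inferenceId: string, force: boolean): Promise<void> {
    // Without force, deletion fails while the endpoint is still
    // referenced (e.g. by an index mapping).
    if (!force && this.endpoints.has(inferenceId)) {
      throw new Error('inference endpoint is in use');
    }
    this.endpoints.delete(inferenceId);
  }

  exists(inferenceId: string): boolean {
    return this.endpoints.has(inferenceId);
  }
}

// Auto-recovery: if an endpoint already exists (possibly half-deployed
// on an undersized cluster), force-delete it, then recreate it so the
// KB is not stuck in a broken state.
async function recreateEndpoint(
  client: MockInferenceClient,
  inferenceId: string
): Promise<void> {
  if (client.exists(inferenceId)) {
    try {
      // force: the endpoint may still be referenced by the KB mapping
      await client.delete(inferenceId, true);
    } catch (error) {
      console.error(`Error deleting inference endpoint: ${error}`);
    }
  }
  await client.put(inferenceId);
}
```

The key design choice is that recreation is unconditional: rather than trying to repair a partially deployed model in place, the endpoint is torn down (with force, since it is referenced by the mapping) and deployed from scratch.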

---------

Co-authored-by: kibanamachine <[email protected]>
(cherry picked from commit 1cb56d7)

# Conflicts:
#	x-pack/plugins/elastic_assistant/server/ai_assistant_data_clients/knowledge_base/index.ts
patrykkopycinski committed Dec 1, 2024
1 parent 8707524 commit 0e2f9fb
Showing 1 changed file with 35 additions and 17 deletions.
```diff
@@ -225,27 +225,45 @@ export class AIAssistantKnowledgeBaseDataClient extends AIAssistantDataClient {
   public createInferenceEndpoint = async () => {
     const elserId = await this.options.getElserId();
     this.options.logger.debug(`Deploying ELSER model '${elserId}'...`);
-    try {
-      const esClient = await this.options.elasticsearchClientPromise;
-      if (this.isV2KnowledgeBaseEnabled) {
-        await esClient.inference.put({
-          task_type: 'sparse_embedding',
+    const esClient = await this.options.elasticsearchClientPromise;
+
+    const inferenceEndpointExists = await this.isInferenceEndpointExists();
+
+    if (inferenceEndpointExists) {
+      try {
+        await esClient.inference.delete({
           inference_id: ASSISTANT_ELSER_INFERENCE_ID,
-          inference_config: {
-            service: 'elasticsearch',
-            service_settings: {
-              adaptive_allocations: {
-                enabled: true,
-                min_number_of_allocations: 0,
-                max_number_of_allocations: 8,
-              },
-              num_threads: 1,
-              model_id: elserId,
+          // it's being used in the mapping so we need to force delete
+          force: true,
+        });
+        this.options.logger.debug(`Deleted existing inference endpoint for ELSER model '${elserId}'`);
+      } catch (error) {
+        this.options.logger.error(
+          `Error deleting inference endpoint for ELSER model '${elserId}':\n${error}`
+        );
+      }
+    }
+
+    try {
+      await esClient.inference.put({
+        task_type: 'sparse_embedding',
+        inference_id: ASSISTANT_ELSER_INFERENCE_ID,
+        inference_config: {
+          service: 'elasticsearch',
+          service_settings: {
+            adaptive_allocations: {
+              enabled: true,
+              min_number_of_allocations: 0,
+              max_number_of_allocations: 8,
             },
-            task_settings: {},
+            num_threads: 1,
+            model_id: elserId,
           },
-        });
-      }
+          task_settings: {},
+        },
+      });
+
+      // await for the model to be deployed
+      await this.isInferenceEndpointExists();
     } catch (error) {
       this.options.logger.error(
         `Error creating inference endpoint for ELSER model '${elserId}':\n${error}`
```
