Add AI connector blueprints for (Azure) OpenAI Ada embedding #1367
For anyone who comes across this issue with the same question as mine, here is how I made it work. Hope it helps.

AI connector for Azure Ada embedding:

POST /_plugins/_ml/connectors/_create
{
  "name": "<YOUR CONNECTOR NAME>",
  "description": "<YOUR CONNECTOR DESCRIPTION>",
  "version": "<YOUR CONNECTOR VERSION>",
  "protocol": "http",
  "parameters": {
    "endpoint": "<YOUR-ORG-NAME>.openai.azure.com",
    "deploy-name": "<YOUR-DEPLOYMENT-NAME>",
    "model": "text-embedding-ada-002",
    "api-version": "2023-07-01-preview",
    "temperature": 0.0
  },
  "credential": {
    "openAI_key": "<YOUR-API-KEY>"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "https://${parameters.endpoint}/openai/deployments/${parameters.deploy-name}/embeddings?api-version=${parameters.api-version}",
      "headers": {
        "api-key": "${credential.openAI_key}"
      },
      "request_body": "{ \"input\": \"${parameters.input}\" }"
    }
  ]
}
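The model ID used in the _predict call below comes from registering and deploying a remote model backed by this connector. A minimal sketch of those two calls, based on the ML Commons remote inference workflow (the model name and description are illustrative, and depending on your cluster settings a model group ID may also be required):

POST /_plugins/_ml/models/_register
{
  "name": "azure-openai-ada-embedding",
  "function_name": "remote",
  "description": "Azure OpenAI Ada embedding model (illustrative)",
  "connector_id": "<CONNECTOR ID FROM THE CREATE RESPONSE>"
}

The register call returns a task ID; fetch it with GET /_plugins/_ml/tasks/<TASK ID> to obtain the model ID, then deploy:

POST /_plugins/_ml/models/<ENTER MODEL ID HERE>/_deploy

Once deployed, test the model: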
POST /_plugins/_ml/models/<ENTER MODEL ID HERE>/_predict
{
  "parameters": {
    "input": "a test message"
  }
}

AI connector for Azure GPT 3.5:

POST /_plugins/_ml/connectors/_create
{
  "name": "<YOUR CONNECTOR NAME>",
  "description": "<YOUR CONNECTOR DESCRIPTION>",
  "version": "<YOUR CONNECTOR VERSION>",
  "protocol": "http",
  "parameters": {
    "endpoint": "<YOUR-ORG-NAME>.openai.azure.com",
    "deploy-name": "<YOUR-DEPLOYMENT-NAME>",
    "model": "gpt-3.5-turbo",
    "api-version": "2023-07-01-preview",
    "temperature": 0.0
  },
  "credential": {
    "openAI_key": "<YOUR-API-KEY>"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "https://${parameters.endpoint}/openai/deployments/${parameters.deploy-name}/chat/completions?api-version=${parameters.api-version}",
      "headers": {
        "api-key": "${credential.openAI_key}"
      },
      "request_body": "{ \"messages\": ${parameters.messages}, \"temperature\": ${parameters.temperature} }"
    }
  ]
}

As above, register and deploy a model with this connector, then test it:

POST /_plugins/_ml/models/<ENTER MODEL ID HERE>/_predict
{
  "parameters": {
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello!"
      }
    ]
  }
}
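As a usage note: once the Ada embedding model is deployed, it can be wired into an ingest pipeline so documents are embedded at index time. A minimal sketch using the neural-search text_embedding processor, assuming that plugin is installed; the pipeline name and field names are illustrative:

PUT /_ingest/pipeline/azure-ada-embedding-pipeline
{
  "description": "Generate embeddings with the Azure OpenAI Ada connector model at ingest time",
  "processors": [
    {
      "text_embedding": {
        "model_id": "<ENTER MODEL ID HERE>",
        "field_map": {
          "text": "text_embedding"
        }
      }
    }
  ]
}

The target field would then be mapped as a knn_vector with dimension 1536 (the text-embedding-ada-002 output size) in any index that uses this pipeline.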
Thank you so much for sharing your solution!

@shengbo-ma Thanks for sharing this. Can you publish a PR to add this blueprint to https://github.com/opensearch-project/ml-commons/tree/main/docs/remote_inference_blueprints

Hi @ylwu-amzn, glad to contribute. I have sent a PR with blueprints updated to work with the recent changes to Azure OpenAI and OpenSearch. Please kindly review and share your comments.
Is your feature request related to a problem?
Add an AI connector blueprint for the (Azure) OpenAI Ada embedding model, in addition to the existing ones. If this is already planned for a future release, please close this issue.
What solution would you like?
A markdown file with an AI connector blueprint for the (Azure) OpenAI Ada embedding model.
What alternatives have you considered?
I am trying to write it myself according to the OpenSearch documentation here, and also using the existing blueprints here.
Do you have any additional context?
Very excited to see the ML extensibility feature, which allows endpoint-based LLMs. I am trying to create my own blueprint for the Azure OpenAI Ada embedding model. If I get it to work, maybe I could create a PR to contribute it, if that is considered useful.