[ML] Inference APIs are now available in serverless #178024
Conversation
The dev console command completion files now correctly state that the inference APIs are available in serverless. Follow-up to elastic#173014 and elastic/elasticsearch-specification#2414
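For readers unfamiliar with the console completion files, here is a minimal sketch of the kind of entry involved, assuming each endpoint definition carries an availability object with stack and serverless flags. The interface names, field layout, and the `inferencePut` example below are illustrative assumptions for this sketch, not copied from this PR or from Kibana's actual generated definitions.

```ts
// Hypothetical shape of one dev console completion entry; the field names
// and types are illustrative assumptions, not the real generated format.
interface EndpointAvailability {
  stack: boolean;
  serverless: boolean;
}

interface EndpointDefinition {
  methods: string[];
  patterns: string[];
  availability: EndpointAvailability;
}

// Before a change like this one, an inference endpoint would be flagged as
// unavailable in serverless; afterwards it is flagged as available, so the
// console suggests it when connected to a serverless project.
const inferencePut: EndpointDefinition = {
  methods: ['PUT'],
  patterns: ['_inference/{task_type}/{inference_id}'],
  availability: { stack: true, serverless: true },
};

console.log(JSON.stringify(inferencePut, null, 2));
```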
Pinging @elastic/ml-ui (:ml)
LGTM
💚 Build Succeeded
To update your PR or re-run it, just comment with:
cc @droberts195
Thanks for this change! Tested locally and verified that these endpoints are now suggested in serverless console.
Thanks for the review @ElenaStoeva