[FEATURE] Allow 3rd party ML connectors #1556
Comments
@binarymax The SageMaker example (https://opensearch.org/docs/latest/ml-commons-plugin/extensibility/index/#adding-trusted-endpoints) is basically an example of an arbitrary (3rd-party) connector. Can you give that a try? @ylwu-amzn We should mention that in the doc and let people know they can use it to point to any arbitrary endpoint.
Thanks @austintlee, good point. We will mention this in the documentation. Yes, you can point to any endpoint. @binarymax, if you have a model endpoint and a sample request, I can also help build a blueprint for you, just like these: https://github.com/opensearch-project/ml-commons/tree/2.x/docs/remote_inference_blueprints
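For anyone who wants to try this before a blueprint lands, here is a rough sketch of what creating a blueprint-style connector against an arbitrary HTTP endpoint could look like. The cluster address, credentials, endpoint URL, and request body below are placeholder assumptions rather than values from this thread, and the target URL also has to be allowed by the trusted-endpoints configuration described in the docs linked above.

```python
# Rough sketch only: create an ml-commons connector that points at an arbitrary
# HTTP inference endpoint. Field names follow the remote-inference blueprint
# format; the cluster address, credentials, endpoint URL, and request body are
# placeholder assumptions.
import json
import requests

OPENSEARCH_URL = "https://localhost:9200"   # assumed local dev cluster
AUTH = ("admin", "admin")                   # assumed basic auth for a dev cluster

connector_body = {
    "name": "generic-3rd-party-connector",
    "description": "Example connector to an arbitrary inference endpoint",
    "version": 1,
    "protocol": "http",
    "parameters": {"model": "my-model"},
    "credential": {"api_key": "<YOUR_API_KEY>"},
    "actions": [
        {
            "action_type": "predict",
            "method": "POST",
            "url": "https://example-inference-host/v1/embed",  # placeholder endpoint
            "headers": {
                "Authorization": "Bearer ${credential.api_key}",
                "Content-Type": "application/json",
            },
            # ${parameters.input} is substituted at predict time
            "request_body": "{ \"model\": \"${parameters.model}\", \"input\": ${parameters.input} }",
        }
    ],
}

response = requests.post(
    f"{OPENSEARCH_URL}/_plugins/_ml/connectors/_create",
    auth=AUTH,
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector_body),
    verify=False,  # self-signed dev certs only; never disable verification in production
)
print(response.json())  # a successful call returns a connector_id
```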
Thanks guys. Apologies if I didn't describe the feature clearly. The issue is that with 3rd-party endpoints, the pre/post-processor logic currently has to be implemented with Painless scripting, which is far from ideal. In discussions I've had with Dylan Tong, we concluded that supporting json-path pre/post-processor values would be much better.
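To make the json-path idea concrete, here is a minimal sketch assuming an invented response shape; ml-commons does not support this today, and jsonpath-ng is used only as one possible evaluator.

```python
# Minimal sketch of the json-path idea (not something ml-commons does today):
# a connector could declare a JSONPath expression and apply it to whatever the
# 3rd-party endpoint returns, instead of requiring a hand-written Painless
# post_process_function. The response shape and the expression are invented.
from jsonpath_ng import parse

# Hypothetical raw response from an arbitrary embedding endpoint.
raw_response = {
    "outputs": [
        {"embedding": [0.12, -0.34, 0.56]},
        {"embedding": [0.78, -0.90, 0.11]},
    ]
}

# One declarative expression stands in for a custom script.
post_process_expr = parse("$.outputs[*].embedding")
embeddings = [match.value for match in post_process_expr.find(raw_response)]

print(embeddings)  # [[0.12, -0.34, 0.56], [0.78, -0.9, 0.11]]
```

The point is that the extraction rule becomes declarative data in the connector definition rather than a script a user has to write and debug.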
@binarymax do you have any example docs for your model? We can help build some pre/post processors.
@binarymax could you please provide us with some examples?
Would you have an example of a blueprint for connecting to Azure OpenAI?
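There is no Azure OpenAI blueprint in this thread, but a hypothetical connector body could follow the same structure as the sketch above; the resource name, deployment name, api-version, and request body are placeholders and would need to be checked against Azure's API.

```python
# Hypothetical sketch, not an official blueprint: a connector body for an Azure
# OpenAI embeddings deployment, following the same structure as the example
# above. Resource name, deployment name, and api-version are placeholders;
# Azure OpenAI authenticates with an "api-key" header instead of a Bearer token.
azure_openai_connector = {
    "name": "azure-openai-embeddings-connector",
    "description": "Example connector to an Azure OpenAI embeddings deployment",
    "version": 1,
    "protocol": "http",
    "parameters": {
        "endpoint": "<your-resource>.openai.azure.com",
        "deployment": "<your-embedding-deployment>",
        "api_version": "2023-05-15",
    },
    "credential": {"api_key": "<AZURE_OPENAI_API_KEY>"},
    "actions": [
        {
            "action_type": "predict",
            "method": "POST",
            "url": (
                "https://${parameters.endpoint}/openai/deployments/"
                "${parameters.deployment}/embeddings"
                "?api-version=${parameters.api_version}"
            ),
            "headers": {
                "api-key": "${credential.api_key}",
                "Content-Type": "application/json",
            },
            "request_body": "{ \"input\": ${parameters.input} }",
        }
    ],
}
```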
Hi @dhrubo-os and @ylwu-amzn, sorry I got totally wrapped up in other work, but I have a working version on my laptop and am trying to find the time to clean it up and open a PR.
Currently, only SageMaker, OpenAI, or Cohere connector endpoints are allowed via blueprints. Enabling other 3rd party connector endpoints requires changing code in ml-commons.
What solution would you like?
It should be possible to connect to any 3rd-party endpoint without Painless scripting, and pre/post processing should be expressible with json-path.
What alternatives have you considered?
Making a 3rd party connector which enforces a pre-defined API. This seems too restrictive, but will be used as a temporary solution to enable a Mighty Inference Server connector: https://max.io/mighty.html
Do you have any additional context?
This feature is dependent on #1555