
[FEATURE] Allow 3rd party ML connectors #1556

Open · binarymax opened this issue Oct 26, 2023 · 7 comments
Labels: enhancement, feature

Comments

@binarymax

Currently, only SageMaker, OpenAI, or Cohere connector endpoints are allowed via blueprints. Enabling other 3rd party connector endpoints requires changing code in ml-commons.

What solution would you like?
It should be possible to connect to any 3rd party endpoint without Painless scripting, and it should be possible to define pre/post processors using json-path.

What alternatives have you considered?
Making a 3rd party connector that enforces a pre-defined API. This seems too restrictive, but it will be used as a temporary solution to enable a Mighty Inference Server connector: https://max.io/mighty.html

Do you have any additional context?
This feature depends on #1555.
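
To sketch what I mean (the `pre_process_json_path` and `post_process_json_path` fields below are hypothetical and only illustrate the proposal; the endpoint and payload shapes are placeholders; today the equivalent fields are the Painless-based `pre_process_function` and `post_process_function`):

```json
POST /_plugins/_ml/connectors/_create
{
  "name": "Generic 3rd-party embedding connector (sketch)",
  "description": "Illustration of json-path based pre/post processing",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "endpoint": "my-embedding-server.example.com:8080"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://${parameters.endpoint}/embed",
      "headers": { "Content-Type": "application/json" },
      "request_body": "{ \"text\": \"${parameters.input}\" }",
      "pre_process_json_path": "$.text_docs[0]",
      "post_process_json_path": "$.outputs[0].embedding"
    }
  ]
}
```

The two json-path expressions would take over the job Painless scripts do today: mapping ml-commons input into the endpoint's request, and mapping the endpoint's response back into the output format ml-commons expects.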

binarymax added the enhancement and untriaged labels on Oct 26, 2023
@austintlee (Collaborator)

@binarymax The SageMaker example (https://opensearch.org/docs/latest/ml-commons-plugin/extensibility/index/#adding-trusted-endpoints) is basically an example of an arbitrary (3rd party) connector. Can you give that a try?
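
Concretely, the step that page walks through is allow-listing the endpoint via the trusted connector URL regex setting, roughly like this (a sketch; the setting name is the one from that doc page, and the last pattern is a placeholder for your own endpoint):

```json
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^https://runtime\\.sagemaker\\..*\\.amazonaws\\.com/.*$",
      "^https://api\\.openai\\.com/.*$",
      "^https://api\\.cohere\\.ai/.*$",
      "^https://my-inference-server\\.example\\.com/.*$"
    ]
  }
}
```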

@ylwu-amzn We should mention that in the doc and let people know they can use it to point to any arbitrary endpoint.

@ylwu-amzn (Collaborator)

Thanks @austintlee, good point. We will mention this in the documentation.

Yes, you can actually point to any endpoint. @binarymax, if you have a model endpoint and a sample request, I can also help build a blueprint for you, just like these: https://github.com/opensearch-project/ml-commons/tree/2.x/docs/remote_inference_blueprints
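
At a high level, once a connector has been created from a blueprint, the rest is just register, deploy, and predict, roughly like this (a sketch; the IDs are placeholders and the _predict parameters depend on the blueprint):

```json
POST /_plugins/_ml/models/_register
{
  "name": "my-remote-embedding-model",
  "function_name": "remote",
  "description": "Remote model backed by a 3rd-party connector",
  "connector_id": "<connector_id returned by the connector _create call>"
}

POST /_plugins/_ml/models/<model_id>/_deploy

POST /_plugins/_ml/models/<model_id>/_predict
{
  "parameters": {
    "input": ["hello world"]
  }
}
```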

@binarymax (Author)

Thanks guys. Apologies if I didn't describe the feature clearly. The issue is that with 3rd party endpoints, the pre/post processor logic currently needs to be implemented with Painless scripting, which is far from ideal. In discussions I've had with Dylan Tong, we thought implementing json-path pre/post processor values would be much better.
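
To make the json-path idea concrete: if an arbitrary endpoint returns a body like the made-up one below, a single expression such as `$.outputs[0].embedding` is enough to pull out the vector, with no Painless script involved:

```json
{
  "took_ms": 12,
  "model": "my-encoder",
  "outputs": [
    { "embedding": [0.12, -0.03, 0.57, 0.44] }
  ]
}
```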

@ylwu-amzn (Collaborator)

@binarymax do you have any example docs for your model? We can help build some pre/post processors.

ylwu-amzn moved this to Untriaged in ml-commons projects on Nov 3, 2023
@dhrubo-os (Collaborator)

@binarymax could you please provide us with some examples?

@drobbins-ancile

> Thanks @austintlee, good point. We will mention this in the documentation.
>
> Yes, you can actually point to any endpoint. @binarymax, if you have a model endpoint and a sample request, I can also help build a blueprint for you, just like these: 2.x/docs/remote_inference_blueprints

Would you have an example of a blueprint for connecting to Azure OpenAI?
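
For what it's worth, I'd expect it to look roughly like the sketch below, modeled on the OpenAI embedding blueprints; this is untested, and the resource name, deployment name, api_version, and credential key name are all placeholders:

```json
POST /_plugins/_ml/connectors/_create
{
  "name": "Azure OpenAI embeddings connector (sketch)",
  "description": "Untested sketch of an Azure OpenAI text-embedding connector",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "endpoint": "<resource-name>.openai.azure.com",
    "deployment": "<deployment-name>",
    "api_version": "2023-05-15"
  },
  "credential": {
    "azure_api_key": "<your Azure OpenAI API key>"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "https://${parameters.endpoint}/openai/deployments/${parameters.deployment}/embeddings?api-version=${parameters.api_version}",
      "headers": {
        "api-key": "${credential.azure_api_key}"
      },
      "request_body": "{ \"input\": ${parameters.input} }"
    }
  ]
}
```

You would likely still need pre/post process functions to map the inputs and outputs (the built-in OpenAI embedding ones may work here since Azure's response format matches OpenAI's), which is exactly the friction this issue is about.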

@binarymax (Author)

Hi @dhrubo-os and @ylwu-amzn, sorry, I got totally wrapped up in other work, but I have a working version on my laptop. I'm trying to find the time to clean it up and open a PR.

b4sjoo moved this from Untriaged to In Progress in ml-commons projects on Jan 19, 2024
Status: In Progress
No branches or pull requests
6 participants