Support OpenAPI #37
/kind feature
I think this issue should be the highest priority, so that the conversation UI, etc. can be integrated into the production environment as soon as possible. I think more people would then be interested in participating. What's your opinion? @kerthcet
I think we need a unified platform to communicate with all kinds of models, whether OpenAI APIs or the llmaz API. For the llmaz API, we can work on this step by step:
Then it would look like we have a list of Models, and people can choose to chat with any kind of model. Any suggestions?
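As a rough illustration of what that "list of Models" could look like to a client, here is a minimal sketch assuming the list were exposed through an OpenAI-compatible `/v1/models` endpoint. The base URL and model names are made up, not actual llmaz endpoints:

```python
from openai import OpenAI

# Hypothetical llmaz endpoint; any OpenAI-compatible client could talk to it.
client = OpenAI(base_url="http://llmaz.example.com/v1", api_key="not-needed")

# List the models registered in llmaz so a user can pick one to chat with.
for model in client.models.list().data:
    print(model.id)  # e.g. "llama-3-8b", "qwen2-7b" (invented names)
```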
Sorry, I misunderstood this issue. What I mean is that llmaz exposes a set of interfaces that conform to OpenAPI input and output, so that users can find usage scenarios as soon as possible. For example, various open-source chat UIs could be integrated with this project, with llmaz used as the backend, allowing users to quickly try out customized models. But this issue is more about directly supporting OpenAPI interface integration rather than model integration. Is that right?
I believe we already have APIs following OpenAPI specifications, like we can visit. But yes, the scheme is a little complex compared to other APIs like in vLLM, it uses.
I have no idea whether other chatbots would like to integrate with our project, and I don't think a standard protocol exists, e.g. how the model list API should look, how to create a model, or what the parameters are. But we can still start with our dashboard, and once there's a standard protocol, exporting the APIs would be really easy, or we could provide a Python library as well.
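To make the integration point concrete: if a standard did emerge, or llmaz simply adopted the OpenAI chat-completions shape that most open-source chat UIs already speak, a chatbot frontend would only need something like the following. This is a sketch under that assumption; the base URL and model name are invented, not actual llmaz endpoints:

```python
from openai import OpenAI

# Assumed OpenAI-compatible gateway in front of llmaz-managed inference backends.
client = OpenAI(base_url="http://llmaz.example.com/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="llama-3-8b",  # hypothetical model name registered in llmaz
    messages=[{"role": "user", "content": "Summarize what llmaz does in one sentence."}],
)
print(response.choices[0].message.content)
```

The same request shape is what a hypothetical Python library could wrap later, so exporting the APIs would mostly be a matter of documenting the base URL and the model naming scheme.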
Right now, we support inference engines like vLLM for inference. What if people want to call OpenAI APIs like ChatGPT? It should be easy to integrate them as well.
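One way to read "easy to integrate": because the request shape is the same, a caller could switch between an llmaz-hosted model and OpenAI's hosted ChatGPT models just by changing the base URL, API key, and model name. A minimal sketch, assuming llmaz exposes (or will expose) an OpenAI-compatible endpoint; the llmaz URL and local model name below are made up:

```python
from openai import OpenAI

# Two backends, one client interface.
backends = [
    # OpenAI's hosted service (real endpoint, real key required).
    (OpenAI(api_key="sk-..."), "gpt-4o-mini"),
    # Hypothetical llmaz-managed model behind an OpenAI-compatible gateway.
    (OpenAI(base_url="http://llmaz.example.com/v1", api_key="not-needed"), "llama-3-8b"),
]

for client, model in backends:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(f"{model}: {reply.choices[0].message.content}")
```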