🚀 The feature
Right now, mainly proprietary LLMs are supported. It would be great to also support DIY/open-source LLMs - for instance, models hosted on Databricks Model Serving endpoints, or more generally, any LLM deployed behind a web API running in a container.
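To make the idea concrete, here is a minimal sketch of what a generic "LLM behind a web API" provider could look like. The endpoint URL, request payload, and response schema below are all assumptions for illustration - Databricks Model Serving and other hosts each define their own schemas, so a real integration would adapt these per backend:

```python
import json
from urllib import request

# Hypothetical generic web-API LLM client. The payload and response
# shapes ({"prompt": ..., "max_tokens": ...} -> {"completion": ...})
# are assumptions, not any particular provider's actual schema.

def build_payload(prompt: str, max_tokens: int = 256) -> bytes:
    """Serialize a completion request body (assumed schema)."""
    return json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode("utf-8")

def parse_completion(body: bytes) -> str:
    """Extract the completion text from a response body (assumed schema)."""
    return json.loads(body)["completion"]

def complete(endpoint: str, token: str, prompt: str) -> str:
    """POST the prompt to a serving endpoint and return the completion text."""
    req = request.Request(
        endpoint,
        data=build_payload(prompt),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with request.urlopen(req) as resp:
        return parse_completion(resp.read())
```

Because the transport is plain HTTP + JSON, the same client would cover a Databricks endpoint, a containerized model, or a model served on localhost - only the URL and schema adapters change.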
Motivation, pitch
I think this would be very useful for people or companies who want to do prompt engineering with open-source LLMs. I'm also happy to work on this feature.
Alternatives
This could also allow testing prompts against models running on the local machine.
Additional context
No response
Thanks for opening this issue! That sounds like a great integration for us to add! If you or anyone else would like to open a PR, we would be more than happy to review it or support you in any way.
We are also happy to discuss other potential integrations with Databricks on our Discord or through other channels. Just let us know!