
Add LM Studio Server option for a custom OpenAI compatible endpoint #37

Open · wants to merge 3 commits into master

Conversation

willasaywhat

This PR adds support for a custom, user-defined endpoint URI for use with LM Studio, enabling self-hosted, local AI model usage through an OpenAI-compatible API.

Blindly editing to add in a local server option via LM Studio.
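A user-defined endpoint like the one this PR adds can be exercised with any OpenAI-compatible client. Below is a minimal, stdlib-only sketch (the helper name, default key, and field names are illustrative, not code from this PR) that builds a chat-completions request against LM Studio's default local server:

```python
import json
from urllib.parse import urljoin

def build_chat_request(base_url, model, messages, api_key="lm-studio"):
    """Build an OpenAI-compatible chat-completions request for a
    user-defined endpoint such as LM Studio's local server
    (default base URL: http://localhost:1234/v1).

    Hypothetical helper for illustration only.
    """
    # Normalize the base URL so urljoin appends rather than replaces the path.
    url = urljoin(base_url.rstrip("/") + "/", "chat/completions")
    headers = {
        "Content-Type": "application/json",
        # LM Studio does not validate the key, but OpenAI clients send one.
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "http://localhost:1234/v1",
    "local-model",
    [{"role": "user", "content": "Hello"}],
)
```

Sending the request (e.g. with `urllib.request` or `requests`) is left out so the sketch does not require a running server.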
@th3f001 commented Aug 2, 2024

+1 to merge this, and possibly also add Ollama as a local LLM-inference provider.
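Ollama would slot into the same mechanism, since it also exposes an OpenAI-compatible API locally. A sketch of the default base URLs (taken from each project's documentation; ports may differ if the user reconfigures them):

```python
# Default OpenAI-compatible base URLs for common local inference servers.
# These are the documented defaults, not values from this PR's code.
LOCAL_ENDPOINTS = {
    "lm-studio": "http://localhost:1234/v1",
    "ollama": "http://localhost:11434/v1",
}
```

With a user-defined endpoint URI, supporting Ollama reduces to pointing the same setting at a different base URL.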

@chenweisomebody126

Yeah, I like this


@chenweisomebody126 left a comment


LGTM

@chenweisomebody126

@willasaywhat I can merge your branch into mine, as I did in chenweisomebody126#1. I'd like to support more models, as you suggested.

@willasaywhat
Author

Sure, that makes sense! I still need to check out the branch that the repo owner put together as well: #25 (comment)

3 participants