
Feature Request: Localhost Version (Open Source LLM) #1

Open
hp0404 opened this issue Aug 23, 2024 · 2 comments

Comments

hp0404 commented Aug 23, 2024

Hi!

First off, great work on askalex! It's super helpful.

I was wondering if you could add support for running the model on localhost instead of relying on OpenAI. This would be awesome for those who prefer using open-source LLMs locally to save on API costs.

It would be cool if we could choose between OpenAI's API and a local model in the settings.
For example:

```python
client = OpenAI(base_url="http://localhost:11434/v1", api_key="...")
```

Maybe you could also include some instructions on how to set up a local model?
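To make the idea concrete, here is a minimal sketch of how the toggle could work. It assumes an environment variable (`ASKALEX_BASE_URL` is an illustrative name, not part of askalex) that, when set, points the OpenAI client at a local OpenAI-compatible server such as Ollama:

```python
import os


def client_kwargs(env=os.environ):
    """Return keyword arguments for openai.OpenAI() based on the environment.

    ASKALEX_BASE_URL is a hypothetical variable name used for illustration.
    """
    base_url = env.get("ASKALEX_BASE_URL")
    if base_url:
        # Local OpenAI-compatible servers (e.g. Ollama) ignore the key,
        # but the client constructor still requires a non-empty string.
        return {"base_url": base_url, "api_key": "ollama"}
    # Default: OpenAI's cloud API; the client reads OPENAI_API_KEY itself.
    return {}
```

Usage would then be `client = OpenAI(**client_kwargs())`, so the same code path serves both backends.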

Thanks for considering this!


hp0404 commented Aug 25, 2024

I've modified the project to run locally and removed the shiny app component—though honestly, it's not necessarily an improvement. I decided not to submit a pull request because it's probably not worth it as is, but it could serve as a starting point for anyone looking to set up a similar project using a local model.

See here: https://github.com/hp0404/askalex

trangdata (Owner) commented:
Thank you for putting this together @hp0404! Would love this as a PR if you could keep the Shiny app. Otherwise, I will take a look at your changes and try to incorporate.
