First off, great work on askalex! It's super helpful.
I was wondering if you could add support for running the model on localhost instead of relying on OpenAI. This would be awesome for those who prefer using open-source LLMs locally to save on API costs.
It would be cool if we could choose between OpenAI's API and a local model in the settings.
e.g.
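Something along these lines; a minimal sketch of what such a toggle could look like (the function name, port, and paths here are purely illustrative, not askalex's actual settings). Many local servers, such as llama.cpp's server or Ollama, expose an OpenAI-compatible endpoint, so switching backends can be as simple as changing the base URL:

```python
import os

# Hypothetical settings helper -- names are illustrative only.
def resolve_backend(use_local: bool = False):
    """Return (base_url, api_key) for either OpenAI's API or a
    local OpenAI-compatible server (e.g. llama.cpp or Ollama)."""
    if use_local:
        # Local servers typically don't check the API key, but the
        # client library still expects a non-empty string.
        return ("http://localhost:8000/v1", "not-needed")
    return ("https://api.openai.com/v1", os.environ.get("OPENAI_API_KEY", ""))
```

The same client code could then run against either backend by passing the returned base URL and key to the OpenAI client.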
I've modified the project to run locally and removed the Shiny app component, though honestly it's not necessarily an improvement. I decided against submitting a pull request because it's probably not worth merging as is, but it could serve as a starting point for anyone looking to set up a similar project with a local model.
Thank you for putting this together @hp0404! I'd love this as a PR if you could keep the Shiny app. Otherwise, I'll take a look at your changes and try to incorporate them.
Hi!
Maybe you could also include some instructions on how to set up a local model?
Thanks for considering this!