Use local LLM via Ollama #123
Is it possible to use a local LLM via Ollama? If so, what's the setup, and what are the requirements for which LLM I can use (guessing it has to use the OpenAI API syntax)?

Comments

Done.

It starts writing out the command but then cancels itself and says this: Request to OpenAI failed with status 404: {

@Ajaymamtora This was fixed in #115, but a new version is not released yet. @steve8708, could you help us with that?

published!

so is this feature implemented?
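For anyone landing here: Ollama exposes an OpenAI-compatible API under `/v1` on its default port, so a tool that speaks the OpenAI API can usually be pointed at it. A minimal sketch, assuming the tool honors the standard `OPENAI_API_BASE` and `OPENAI_API_KEY` environment variables (the thread doesn't confirm which variables this project reads):

```shell
# Point an OpenAI-client tool at a local Ollama server.
# Ollama serves an OpenAI-compatible API at http://localhost:11434/v1 by default.
export OPENAI_API_BASE="http://localhost:11434/v1"

# Ollama ignores the API key, but most clients refuse to start without a non-empty one.
export OPENAI_API_KEY="ollama"
```

You'd then pull a model locally (e.g. `ollama pull llama3`) and pass its name wherever the tool expects an OpenAI model name.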