Ollama Support #15
Conversation
Just some questions:
Which lets you specifically install what you need. You guys should collaborate to finish this PR @chrisaddy <3 |
Yes, from the docs: streaming responses. And I like the idea of having it be specific packages you install, separating and choosing specifically what functionality you want; we should do the same thing with parsing, e.g. `pip install ell-openai`. Although having them all be in the same package and just cleanly swapping how the functionality is done could be good too |
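For context, a minimal sketch of what streaming looks like with the `ollama` Python package (the model name and prompt here are placeholders, not anything from this PR):

```python
# Sketch only: streaming a response with the ollama Python package.
# "llama3" is just an example model; use whatever you have pulled locally.
import ollama

stream = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    stream=True,  # yields partial chunks instead of one final response
)

for chunk in stream:
    # each chunk carries a piece of the message; print it as it arrives
    print(chunk["message"]["content"], end="", flush=True)
```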
kk :) will deal with this in a sec and do some mods <3 |
Is this something that is still going to be implemented? I see `examples/ollama_example.py`; however, that example seems outdated and doesn't work for me even with the modifications that I believe this PR addresses. Will gladly help if it is needed! |
Would it be simpler to use their OpenAI-compatible API? Or use a package like litellm? |
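For reference, a rough sketch of the OpenAI-compatible route mentioned above (the base URL and model name are Ollama defaults, not code from this PR):

```python
# Sketch: pointing the stock openai client at Ollama's OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's local OpenAI-compatible API
    api_key="ollama",  # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # any model pulled locally with `ollama pull`
    messages=[{"role": "user", "content": "Hello from ell!"}],
)
print(response.choices[0].message.content)
```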
Works like this: #172 (comment) |
Okay I've decided to add official ollama support. This will be coming soon |
Hopefully you're okay with this approach. I added the ollama library, so we need to add that to the dependencies list. Also, if you could check and make sure that openai isn't broken by this; I don't want to put money in XD
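One way to keep the openai path untouched would be to treat ollama as an optional import; a rough sketch (the helper name and flag here are hypothetical, not the PR's actual code):

```python
# Hypothetical sketch: import ollama lazily so the existing OpenAI code path
# keeps working for users who don't install the new dependency.
try:
    import ollama
except ImportError:
    ollama = None

def complete(model: str, messages: list, use_ollama: bool = False) -> str:
    if use_ollama:
        if ollama is None:
            raise RuntimeError("ollama is not installed; `pip install ollama` to use it")
        return ollama.chat(model=model, messages=messages)["message"]["content"]
    from openai import OpenAI  # unchanged OpenAI path
    resp = OpenAI().chat.completions.create(model=model, messages=messages)
    return resp.choices[0].message.content
```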