
Ollama Support #15

Open · wants to merge 1 commit into main

Conversation

CraigWA commented Aug 1, 2024

Hopefully you're okay with this approach. I added the ollama library, so we need to add that to the dependencies list. Also, could you check and make sure that openai isn't broken by this? I don't want to put money in XD

MadcowD (Owner) commented Aug 1, 2024

Just some questions:

  1. Does ollama support streaming? It would be ideal if we don't have to bifurcate API calls. Otherwise we should actually build an interface to move this code out of lm.py and handle processing & adapting client responses per client. (I am less of a fan of this.)

  2. Maybe @chrisaddy knows how to do this, but we could do something like

```
pip install ell
pip install ell[ollama]
```

which lets you install ollama support for ell specifically, so we can have modular dependencies :)
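For reference, that style of optional dependency is declared through packaging extras; here is a minimal sketch using setuptools' extras_require (ell's actual metadata and dependency list are assumptions here, not taken from this PR):

```python
# setup.py, a sketch only; ell's real packaging layout may differ
from setuptools import setup, find_packages

setup(
    name="ell",
    packages=find_packages(),
    install_requires=["openai"],   # core dependency
    extras_require={
        # enables `pip install ell[ollama]`
        "ollama": ["ollama"],
    },
)
```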

You guys should collaborate to finish this PR @chrisaddy <3

CraigWA (Author) commented Aug 1, 2024

Yes, from the docs:

> Streaming responses
> Response streaming can be enabled by setting stream=True, modifying function calls to return a Python generator where each part is an object in the stream.

```python
stream = ollama.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)
```
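For completeness, a minimal sketch of consuming that generator, with chunk fields following the ollama-python docs quoted above:

```python
import ollama

stream = ollama.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)

# each chunk carries a partial assistant message
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
```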

And I like the idea of having specific packages you install, separating out and choosing specifically what functionality you want. We should do the same thing with parsing.

```
pip install ell-openai
pip install ell-ollama
pip install ell-claude
pip install ell-parser
pip install ell-all
```

The last one is for when you want to juggle multiple providers at the same time.

Although having them all in the same package, and just cleaning up then swapping how the functionality is done, could be good too.

MadcowD (Owner) commented Aug 3, 2024

kk :) will deal with this in a sec and do some mods <3

WeldFire commented

Is this something that is still going to be implemented?

I see the 'examples/ollama_example.py'; however, that example seems outdated and doesn't work for me even with the modifications that I believe this PR addresses.

Will gladly help if it is needed!

rosmur commented Sep 19, 2024

Would it be simpler to use their OpenAI-compatible API? Or use a package like litellm?
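For context: Ollama does expose an OpenAI-compatible endpoint at /v1, so the existing openai code path could in principle be reused. A minimal sketch (the localhost port is Ollama's default and the model name is just an example, neither comes from this PR):

```python
from openai import OpenAI

# point the stock openai client at Ollama's OpenAI-compatible API;
# an api_key is required by the client but ignored by Ollama
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
```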

bharattrader commented

> Is this something that is still going to be implemented?
>
> I see the 'examples/ollama_example.py'; however, that example seems outdated and doesn't work for me even with the modifications that I believe this PR addresses.
>
> Will gladly help if it is needed!

Works like this: #172 (comment)

MadcowD (Owner) commented Sep 26, 2024

Okay, I've decided to add official ollama support. This will be coming soon.
