Ollama Support #237

Merged: 12 commits into AgentOps-AI:main on Jun 19, 2024

Conversation

@sprajosh (Contributor) commented Jun 5, 2024

📥 Pull Request

📘 Description
Add support for Ollama by patching the ollama.chat function.
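
For reference, the patching approach is plain monkey-patching: swap ollama.chat for a wrapper that records the call and forwards to the original. A minimal sketch of the idea; the names (_original_chat, the event-reporting hook) are illustrative, not the actual llm_tracker.py implementation:

import ollama

# Keep a handle to the original function so the override can be
# undone later (see the "Undo override for all" todo below).
_original_chat = ollama.chat

def _patched_chat(*args, **kwargs):
    response = _original_chat(*args, **kwargs)
    # Illustrative hook: report the prompt/response pair to the
    # active AgentOps session as an LLM event here.
    return response

def override_chat():
    ollama.chat = _patched_chat

def undo_override_chat():
    ollama.chat = _original_chat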

🔄 Related Issue (if applicable)
Ollama support #192

🎯 Goal
Add support for the official Ollama Python library.

🧪 Testing

import ollama
import agentops

AGENTOPS_API_KEY = "<api-key>"
agentops.init(AGENTOPS_API_KEY)

# Sync chat
response = ollama.chat(
    model="orca-mini",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

# Streaming chat (stream=True returns a generator of chunks)
stream = ollama.chat(
    model="orca-mini",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    stream=True,
)

for chunk in stream:
    print(chunk)

agentops.end_session("Success")
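
The snippet above covers the sync and streaming paths; the async path goes through ollama.AsyncClient, which this PR also patches. A minimal sketch, assuming a local Ollama server with orca-mini pulled:

import asyncio

import ollama

async def main():
    # AsyncClient.chat is awaitable, unlike the module-level ollama.chat
    client = ollama.AsyncClient()
    response = await client.chat(
        model="orca-mini",
        messages=[{"role": "user", "content": "Why is the sky blue?"}],
    )
    print(response["message"]["content"])

asyncio.run(main())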

This is a first draft; I'd like some feedback to understand whether I'm missing something.
Also, I don't see the analytics on the session drill-down view. I'll check the frontend project as well, since this may be because ollama is an unknown event type.

Todo

  • Ollama sync
  • Ollama sync with stream
  • Ollama sync client
  • Ollama async client
  • Count input and output tokens
  • Undo override for all
  • Minimum ollama version support

Dependencies
AgentOps-AI/tokencost#49 - Ollama support in tokencost to count tokens from messages.
Token cost is calculated on the server.
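
For context, tokencost exposes counting helpers along these lines; a rough sketch of the intended usage. Whether "orca-mini" is accepted as a model name depends on the linked tokencost PR, so treat that argument as an assumption:

from tokencost import count_message_tokens, count_string_tokens

messages = [{"role": "user", "content": "Why is the sky blue?"}]

# Input tokens from the chat messages; output tokens from the completion text.
# "orca-mini" as a supported model name is an assumption pending the linked PR.
prompt_tokens = count_message_tokens(messages, model="orca-mini")
completion_tokens = count_string_tokens("The sky is blue because...", model="orca-mini")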

@siyangqiu (Contributor) left a comment

Thanks for contributing to AgentOps! I left a few comments that may answer some of your questions; let me know if anything is unclear. I haven't tried your change yet, so I don't know why you aren't seeing the data on the front-end.

3 review comments on agentops/llm_tracker.py (resolved)
@siyangqiu (Contributor) commented

It looks like ollama has a streaming mode. I don't know if you want to add support for that.

@sprajosh (Contributor, Author) commented Jun 6, 2024

> It looks like ollama has a streaming mode. I don't know if you want to add support for that.

Yes, I have added support for ollama.chat, ollama.chat with stream, ollama.Client.chat, and ollama.AsyncClient.chat.
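
For illustration, the instantiated-client path looks like this; a minimal sketch, assuming a local Ollama server (the host shown is just Ollama's default):

import ollama

# Instantiated clients route through the same patched chat method
# as the module-level ollama.chat call.
client = ollama.Client(host="http://localhost:11434")
response = client.chat(
    model="orca-mini",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])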

@siyangqiu (Contributor) commented Jun 10, 2024

Awesome! Thanks for making the changes. Sorry I wasn't clear enough about token cost! I'll test this again once you remove tokencost.

gitguardian bot commented Jun 11, 2024

✅ There are no secrets present in this pull request anymore.

If these secrets were true positives and are still valid, we highly recommend revoking them. Once a secret has been leaked into a git repository, you should consider it compromised, even if it was deleted immediately. Find more information about risks here.

@siyangqiu (Contributor) commented

I just tested this and it works! Good work! 🎉

I noticed that prompt tokens weren't being counted, but I suspect that's something I need to fix on the API server.

@siyangqiu (Contributor) left a comment

Awesome work! Just some nits and you're good to go!

2 review comments on agentops/llm_tracker.py (resolved)
@siyangqiu (Contributor) left a comment

🎉🎉🎉

@siyangqiu siyangqiu merged commit 4b86d9a into AgentOps-AI:main Jun 19, 2024
7 checks passed