Ollama Support #237
Conversation
Thanks for contributing to AgentOps! I left a few comments that may answer some of your questions; let me know if anything is unclear. I haven't tried your change yet, so I don't know exactly why you aren't seeing the data on the front-end.
It looks like ollama has a streaming mode. I don't know if you want to add support for that.
Yes, I have added support for ollama.chat, ollama.chat with stream, ollama.Client.chat, and ollama.AsyncClient.chat.
Awesome! Thanks for making the changes. Sorry I wasn't clear enough about token cost. I'll test this again once you remove tokencost.
I just tested this and it works! Good work! 🎉 I noticed that prompt tokens weren't being counted, but I suspect that's something I need to fix on the API server.
Awesome work! Just some nits and you're good to go!
🎉🎉🎉
📥 Pull Request
📘 Description
Add support for Ollama by patching the ollama.chat function.
🔄 Related Issue (if applicable)
Ollama support #192
🎯 Goal
Add support for the official Ollama Python library.
🔍 Additional Context
Any extra information or context to help us understand the change?
🧪 Testing
This is a first draft. I'd like some feedback to understand if I'm missing something.
Also, I don't see the analytics on the session drill-down view. I will have to check the frontend project as well to see whether this is happening because ollama is an unknown event.

Todo
Dependencies
~~AgentOps-AI/tokencost#49 - Ollama support in tokencost to count tokens from messages~~ (no longer needed: token cost is calculated on the server).