separated llm tracker by model (#346)
* separated llm tracker by model

* llm refactor progress

* working for openai

* groq provider

* cohere provider

* ollama provider

* remove support for openai v0

* cohere support

* test import fix

* test import fix

* groq test fix

* ollama tests

* litellm tests

* dont import litellm

* cohere fixes and tests

* oai version <0.1 better deprecation warning

* [FEATURE] Add Anthropic LLM support via `anthropic` Python SDK (#332)

* fix typo

* add anthropic support

* add example file

* fixed linting using black

* add time travel support for anthropic

* linting

* fix kwargs key to get prompt

* fix for extracting tokens from Message.usage

* minor fix for output tokens

* remove anthropic example python file

* fix completions not show in session

* some more fixes and cleanup

* add Message object to pydantic models

* fix typo

* overhaul anthropic code

* linting

* add anthropic example notebook

* linting

* added readme examples

* fix incorrect attribute access for the model content

* add async example

* refactor code

* fix function name

* linting

* add provider tests

---------

Co-authored-by: reibs <[email protected]>

* added undo to canaries

* added anthropic tests

* undo instrumenting for litellm

* cohere considerations

---------

Co-authored-by: Pratyush Shukla <[email protected]>
Co-authored-by: reibs <[email protected]>
3 people authored Aug 20, 2024
1 parent 06657e1 commit caaacc3
Showing 32 changed files with 2,259 additions and 1,110 deletions.
108 changes: 107 additions & 1 deletion README.md
@@ -249,7 +249,113 @@ agentops.end_session('Success')
</details>


### LiteLLM
### Anthropic

Track agents built with the Anthropic Python SDK (>=0.32.0).

- [AgentOps integration example](./examples/anthropic/anthropic_example.ipynb)
- [Official Anthropic documentation](https://docs.anthropic.com/en/docs/welcome)

<details>
<summary>Installation</summary>

```bash
pip install anthropic
```

```python
import os

import anthropic
import agentops

# Beginning of program's code (i.e. main.py, __init__.py)
agentops.init(<INSERT YOUR API KEY HERE>)

client = anthropic.Anthropic(
    # This is the default and can be omitted
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Tell me a cool fact about AgentOps",
        }
    ],
    model="claude-3-opus-20240229",
)
print(message.content)

agentops.end_session('Success')
```

Streaming
```python
import os

import anthropic
import agentops

# Beginning of program's code (i.e. main.py, __init__.py)
agentops.init(<INSERT YOUR API KEY HERE>)

client = anthropic.Anthropic(
    # This is the default and can be omitted
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

stream = client.messages.create(
    max_tokens=1024,
    model="claude-3-opus-20240229",
    messages=[
        {
            "role": "user",
            "content": "Tell me something cool about streaming agents",
        }
    ],
    stream=True,
)

response = ""
for event in stream:
    if event.type == "content_block_delta":
        response += event.delta.text
    elif event.type == "message_stop":
        print("\n")
        print(response)
        print("\n")
```

Async

```python
import asyncio
import os

from anthropic import AsyncAnthropic

client = AsyncAnthropic(
    # This is the default and can be omitted
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)


async def main() -> None:
    message = await client.messages.create(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Tell me something interesting about async agents",
            }
        ],
        model="claude-3-opus-20240229",
    )
    print(message.content)


# Top-level `await main()` works in a notebook; in a script, use asyncio.run:
asyncio.run(main())
```
</details>

### LiteLLM 🚅

AgentOps provides support for LiteLLM (>=1.3.1), allowing you to call 100+ LLMs using the same input/output format.

4 changes: 2 additions & 2 deletions agentops/client.py
@@ -19,15 +19,15 @@
from termcolor import colored

from .event import Event, ErrorEvent
-from .helpers import (
+from .singleton import (
    conditional_singleton,
)
from .session import Session, active_sessions
from .host_env import get_host_env
from .log_config import logger
from .meta_client import MetaClient
from .config import Configuration
-from .llm_tracker import LlmTracker
+from .llms import LlmTracker


@conditional_singleton
2 changes: 1 addition & 1 deletion agentops/event.py
@@ -70,7 +70,7 @@ class LLMEvent(Event):
thread_id(UUID, optional): The unique identifier of the contextual thread that a message pertains to.
prompt(str, list, optional): The message or messages that were used to prompt the LLM. Preferably in ChatML format which is more fully supported by AgentOps.
prompt_tokens(int, optional): The number of tokens in the prompt message.
-completion(str, object, optional): The message or returned by the LLM. Preferably in ChatML format which is more fully supported by AgentOps.
+completion(str, object, optional): The message or messages returned by the LLM. Preferably in ChatML format which is more fully supported by AgentOps.
completion_tokens(int, optional): The number of tokens in the completion message.
model(str, optional): LLM model e.g. "gpt-4", "gpt-3.5-turbo".
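The field set documented above can be sketched as a plain dataclass. This is an illustrative stand-in, not the actual agentops `LLMEvent` class; the field names and types mirror the docstring:

```python
from dataclasses import dataclass
from typing import Any, List, Optional, Union
from uuid import UUID, uuid4


# Illustrative stand-in for agentops' LLMEvent (not the real class).
@dataclass
class LLMEventSketch:
    thread_id: Optional[UUID] = None
    prompt: Optional[Union[str, List[Any]]] = None          # preferably ChatML
    prompt_tokens: Optional[int] = None
    completion: Optional[Union[str, object]] = None         # preferably ChatML
    completion_tokens: Optional[int] = None
    model: Optional[str] = None                             # e.g. "gpt-4"


event = LLMEventSketch(
    thread_id=uuid4(),
    prompt=[{"role": "user", "content": "hi"}],
    prompt_tokens=2,
    completion={"role": "assistant", "content": "hello"},
    completion_tokens=2,
    model="gpt-4",
)
print(event.model)
```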
33 changes: 0 additions & 33 deletions agentops/helpers.py
@@ -1,7 +1,6 @@
from pprint import pformat
from functools import wraps
from datetime import datetime, timezone
import json
import inspect
from typing import Union
import http.client
@@ -11,38 +10,6 @@
from .log_config import logger
from uuid import UUID
from importlib.metadata import version
import subprocess

ao_instances = {}


def singleton(class_):

    def getinstance(*args, **kwargs):
        if class_ not in ao_instances:
            ao_instances[class_] = class_(*args, **kwargs)
        return ao_instances[class_]

    return getinstance


def conditional_singleton(class_):

    def getinstance(*args, **kwargs):
        use_singleton = kwargs.pop("use_singleton", True)
        if use_singleton:
            if class_ not in ao_instances:
                ao_instances[class_] = class_(*args, **kwargs)
            return ao_instances[class_]
        else:
            return class_(*args, **kwargs)

    return getinstance


def clear_singletons():
    global ao_instances
    ao_instances = {}


def get_ISO_time():
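The `conditional_singleton` decorator deleted from helpers.py (now imported from `agentops/singleton.py`, per the client.py change above) can be exercised in isolation. `Tracker` is a hypothetical class used only for this demo:

```python
ao_instances = {}


def conditional_singleton(class_):
    # Same pattern as the decorator removed above: reuse one shared instance
    # per class unless the caller passes use_singleton=False.
    def getinstance(*args, **kwargs):
        use_singleton = kwargs.pop("use_singleton", True)
        if use_singleton:
            if class_ not in ao_instances:
                ao_instances[class_] = class_(*args, **kwargs)
            return ao_instances[class_]
        else:
            return class_(*args, **kwargs)

    return getinstance


@conditional_singleton
class Tracker:  # hypothetical class, for illustration only
    pass


a = Tracker()
b = Tracker()                      # reuses the cached instance
c = Tracker(use_singleton=False)   # opts out, gets a fresh instance
print(a is b, a is c)  # → True False
```

The `use_singleton` kwarg is popped before construction, so it never reaches the wrapped class's `__init__`.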