
Add context manager streaming support for Anthropic #595

Closed
Changes from all commits
40 commits
a8c6be9
feat: implement context handler streaming for Anthropic provider
devin-ai-integration[bot] Dec 19, 2024
b3e6242
test: update Anthropic model version in canary tests
devin-ai-integration[bot] Dec 19, 2024
8f51d63
style: apply ruff formatting fixes
devin-ai-integration[bot] Dec 19, 2024
84d38e7
refactor: simplify Anthropic provider and implement proper streaming
devin-ai-integration[bot] Dec 19, 2024
c07c9fe
refactor: update AnthropicProvider to use InstrumentedProvider
devin-ai-integration[bot] Dec 19, 2024
d441948
fix: update Anthropic provider and examples to handle streaming chunk…
devin-ai-integration[bot] Dec 19, 2024
4e30f2e
feat: add StreamWrapper for proper context manager support
devin-ai-integration[bot] Dec 19, 2024
2b43aed
feat: add iteration support to StreamWrapper and update examples
devin-ai-integration[bot] Dec 19, 2024
4d3cd60
fix: handle coroutine in StreamWrapper async iteration
devin-ai-integration[bot] Dec 19, 2024
5dcaa85
fix: handle text chunks directly in async streaming
devin-ai-integration[bot] Dec 19, 2024
9284009
fix: update async example to handle text chunks directly
devin-ai-integration[bot] Dec 19, 2024
f1b957f
fix: use text_stream for async streaming in both provider and example
devin-ai-integration[bot] Dec 19, 2024
ab0a772
fix: add text_stream property to StreamWrapper
devin-ai-integration[bot] Dec 19, 2024
25c8675
feat: update example notebooks for async streaming
devin-ai-integration[bot] Dec 19, 2024
539eb63
fix: improve StreamWrapper async iteration handling
devin-ai-integration[bot] Dec 19, 2024
d579339
fix: update async streaming to use event-based iteration
devin-ai-integration[bot] Dec 19, 2024
6a177cf
fix: improve StreamWrapper async iteration and remove duplicate __ait…
devin-ai-integration[bot] Dec 19, 2024
1567252
fix: update examples to use text_stream property consistently
devin-ai-integration[bot] Dec 20, 2024
e370952
fix: update StreamWrapper event accumulation
devin-ai-integration[bot] Dec 20, 2024
bca1a44
fix: update StreamWrapper to handle both dict and object types
devin-ai-integration[bot] Dec 20, 2024
7caceec
fix: clean up AgentOps client configuration in async example
devin-ai-integration[bot] Dec 20, 2024
f7a64ff
fix: update Client initialization in async example
devin-ai-integration[bot] Dec 20, 2024
0e965c2
fix: update AnthropicProvider to properly handle async_client
devin-ai-integration[bot] Dec 20, 2024
1bfc88b
fix: update StreamWrapper and create_stream_async for proper async ha…
devin-ai-integration[bot] Dec 20, 2024
c434d3e
fix: update StreamWrapper for proper async context management and eve…
devin-ai-integration[bot] Dec 20, 2024
d405cb2
fix: remove redundant stream parameter in async streaming
devin-ai-integration[bot] Dec 20, 2024
8a58ad2
fix: add name attribute to AnthropicProvider
devin-ai-integration[bot] Dec 20, 2024
7271df6
fix: update session initialization in AnthropicProvider
devin-ai-integration[bot] Dec 20, 2024
8eaf24b
fix: update async streaming implementation and example
devin-ai-integration[bot] Dec 20, 2024
ae2ac38
fix: update StreamWrapper and AnthropicProvider for proper async stre…
devin-ai-integration[bot] Dec 20, 2024
ef196c0
fix: update StreamWrapper event initialization with proper attributes
devin-ai-integration[bot] Dec 20, 2024
4e4651a
fix: add proper text_stream initialization in StreamWrapper
devin-ai-integration[bot] Dec 20, 2024
f145b27
fix: update StreamWrapper to handle different message content structures
devin-ai-integration[bot] Dec 20, 2024
a354d97
fix: update StreamWrapper event handling and text accumulation
devin-ai-integration[bot] Dec 20, 2024
f2061b7
fix: update async example with proper session handling
devin-ai-integration[bot] Dec 20, 2024
e4e0640
fix: update both examples with proper session handling and event trac…
devin-ai-integration[bot] Dec 20, 2024
5037392
fix: update examples with proper AgentOps client initialization
devin-ai-integration[bot] Dec 20, 2024
9dd3ae4
fix: update examples to use provider streaming methods
devin-ai-integration[bot] Dec 20, 2024
cc945a8
fix: update AnthropicProvider streaming implementation
devin-ai-integration[bot] Dec 20, 2024
041d51f
Merge branch 'main' into devin/1734589134-anthropic-streaming-context
the-praxs Dec 24, 2024
638 changes: 330 additions & 308 deletions agentops/llms/providers/anthropic.py

Large diffs are not rendered by default.
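The provider diff is not rendered above, but the commit messages (`StreamWrapper`, `text_stream`, event accumulation) outline its shape. Below is a minimal, hypothetical sketch of the idea on the sync side — a wrapper that acts as a context manager while accumulating streamed text for event tracking. All names here are assumed from the commit messages, not taken from the actual diff:

```python
class StreamWrapper:
    """Hypothetical sketch: wrap an iterable of text chunks so it can be
    used as a context manager while accumulating the full response."""

    def __init__(self, stream):
        self._stream = stream          # underlying iterable of text chunks
        self.accumulated_text = ""     # full response, built up during iteration

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # a real wrapper would close the stream and record the LLM event here
        return False

    @property
    def text_stream(self):
        # yield chunks while accumulating them for event tracking
        for chunk in self._stream:
            self.accumulated_text += chunk
            yield chunk


# Usage: iterate chunks inside a `with` block, then read the full text.
with StreamWrapper(iter(["Hello, ", "Pilot."])) as stream:
    message = "".join(stream.text_stream)

print(message)                  # Hello, Pilot.
print(stream.accumulated_text)  # Hello, Pilot.
```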

319 changes: 101 additions & 218 deletions examples/anthropic_examples/anthropic-example-async.ipynb

Large diffs are not rendered by default.

122 changes: 122 additions & 0 deletions examples/anthropic_examples/anthropic-example-async.py
@@ -0,0 +1,122 @@
#!/usr/bin/env python
# coding: utf-8

"""
Anthropic Async Example

Anthropic supports both sync and async streaming! This example demonstrates async streaming
with a program called "Titan Support Protocol." The program assigns a personality type
to a mech and generates messages based on the Titan's health status, while concurrently
generating verification UUIDs.
"""

# Import required libraries
import os
import asyncio
from dotenv import load_dotenv
import anthropic
from agentops import Client
from agentops.llms.providers.anthropic import AnthropicProvider
from agentops.session import EndState

# Setup environment and API keys
load_dotenv()

# Initialize clients with explicit API key
anthropic_client = anthropic.Client(api_key=os.getenv("ANTHROPIC_API_KEY"))

# Initialize AgentOps client
ao_client = Client()
ao_client.configure(api_key=os.getenv("AGENTOPS_API_KEY"), default_tags=["anthropic-async"])
ao_client.initialize()

"""
Titan Personalities:
- Legion: Relentless and heavy-hitting, embodies brute strength
- Northstar: Precise and agile sniper, excels in long-range combat
- Ronin: Swift and aggressive melee specialist, close-quarters combat expert
"""

# Define personality presets
TitanPersonality = [
"Legion is a relentless and heavy-hitting Titan that embodies brute strength and defensive firepower. He speaks bluntly.",
"Northstar is a precise and agile sniper that excels in long-range combat and flight. He speaks with an edge of coolness to him",
"Ronin is a swift and aggressive melee specialist who thrives on close-quarters hit-and-run tactics. He talks like a Samurai might.",
]

# Define health status presets
TitanHealth = [
"Fully functional",
"Slightly Damaged",
"Moderate Damage",
"Considerable Damage",
"Near Destruction",
]

# Define a fixed personality and health status for this example
Personality = "Ronin is a swift and aggressive melee specialist who thrives on close-quarters hit-and-run tactics. He talks like a Samurai might."
Health = "Considerable Damage"


async def generate_message(provider, personality, health):
    """Generate a Titan status message using the Anthropic API."""
    prompt = f"""You are a Titan from Titanfall. Your personality is: {personality}
Your current health status is: {health}

Generate a short status report (2-3 sentences) that reflects both your personality and current health status.
Keep the tone consistent with a military combat AI but influenced by your unique personality."""

    try:
        async with provider.create_stream_async(
            max_tokens=1024,
            model="claude-3-sonnet-20240229",
            messages=[{"role": "user", "content": prompt}],
        ) as stream:
            message = ""
            async for text in stream:
                print(text, end="", flush=True)
                message += text
            print()  # Add newline after message
            return message
    except Exception as e:
        print(f"Error generating message: {e}")
        return "Error: Unable to generate Titan status report."


async def main():
    """Main function to run the Titan Support Protocol."""
    print("Initializing Titan Support Protocol...\n")

    # Start a session on the already-initialized module-level AgentOps client
    session = ao_client.start_session()

    try:
        # Initialize Anthropic provider
        provider = AnthropicProvider(session=session)

        # Define Titan personality and health status
        personality = "Ronin is a swift and aggressive melee specialist who thrives on close-quarters hit-and-run tactics. He talks like a Samurai might."
        health = "Considerable Damage"

        print(f"Personality: {personality}")
        print(f"Health Status: {health}")
        print("\nCombat log incoming from encrypted area")

        # Generate Titan status message
        message = await generate_message(provider, personality, health)
        print(f"\nTitan Status Report: {message}")

        # End session with success status
        session.end_session(end_state=EndState.SUCCESS)

    except Exception as e:
        print(f"Error in Titan Support Protocol: {e}")
        session.end_session(end_state=EndState.FAIL)


if __name__ == "__main__":
    # Run the main function using asyncio
    asyncio.run(main())
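The example above drives `provider.create_stream_async(...)` with both `async with` and `async for`. A minimal sketch of the async side of such a wrapper — a hypothetical stand-in whose names echo the commit messages rather than the real provider code:

```python
import asyncio


class AsyncStreamWrapper:
    """Hypothetical sketch: async context manager + async iterator over text chunks."""

    def __init__(self, chunks):
        self._chunks = chunks        # stand-in for the Anthropic event stream
        self.accumulated_text = ""   # what an LLMEvent would record on exit

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # a real wrapper would finalize and record the LLM event here
        return False

    def __aiter__(self):
        return self._gen()

    async def _gen(self):
        for chunk in self._chunks:
            self.accumulated_text += chunk
            yield chunk


async def demo():
    # consume the wrapper the same way the example consumes the provider stream
    async with AsyncStreamWrapper(["Ronin ", "standing ", "by."]) as stream:
        parts = [text async for text in stream]
    return "".join(parts), stream.accumulated_text


print(asyncio.run(demo()))  # ('Ronin standing by.', 'Ronin standing by.')
```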

113 changes: 113 additions & 0 deletions examples/anthropic_examples/anthropic-example-async.txt
@@ -0,0 +1,113 @@
"""
Anthropic Async Example

Anthropic supports both sync and async streaming! This example demonstrates async streaming
with a program called "Titan Support Protocol." The program assigns a personality type
to a mech and generates messages based on the Titan's health status, while concurrently
generating verification UUIDs.
"""

# Import required libraries
from anthropic import AsyncAnthropic
import agentops
from dotenv import load_dotenv
import os
import random
import asyncio
import uuid

# Setup environment and API keys
load_dotenv()
ANTHROPIC_API_KEY = os.getenv("ANTHROPIC_API_KEY") or "<your_anthropic_key>"
AGENTOPS_API_KEY = os.getenv("AGENTOPS_API_KEY") or "<your_agentops_key>"

# Initialize the async Anthropic client and AgentOps session
client = AsyncAnthropic(api_key=ANTHROPIC_API_KEY)
agentops.init(AGENTOPS_API_KEY, default_tags=["anthropic-async"])

"""
Titan Personalities:
- Legion: Relentless and heavy-hitting, embodies brute strength
- Northstar: Precise and agile sniper, excels in long-range combat
- Ronin: Swift and aggressive melee specialist, close-quarters combat expert
"""

# Define personality presets
TitanPersonality = [
    "Legion is a relentless and heavy-hitting Titan that embodies brute strength and defensive firepower. He speaks bluntly.",
    "Northstar is a precise and agile sniper that excels in long-range combat and flight. He speaks with an edge of coolness to him.",
    "Ronin is a swift and aggressive melee specialist who thrives on close-quarters hit-and-run tactics. He talks like a Samurai might.",
]

# Define health status presets
TitanHealth = [
    "Fully functional",
    "Slightly Damaged",
    "Moderate Damage",
    "Considerable Damage",
    "Near Destruction",
]

# Pick a random personality and health status
Personality = random.choice(TitanPersonality)
Health = random.choice(TitanHealth)


async def generate_message():
    """Generate a Titan message using the async streaming context manager."""
    async with client.messages.stream(
        max_tokens=1024,
        model="claude-3-5-sonnet-20240620",
        messages=[
            {
                "role": "user",
                "content": "You are a Titan; a mech from Titanfall 2. Based on your titan's personality and status, generate a message for your pilot. If Near Destruction, make an all caps death message such as AVENGE ME or UNTIL NEXT TIME.",
            },
            {
                "role": "assistant",
                "content": "Personality: Legion is a relentless and heavy-hitting Titan that embodies brute strength and defensive firepower. He speaks bluntly. Status: Considerable Damage",
            },
            {
                "role": "assistant",
                "content": "Heavy damage detected. Reinforcements would be appreciated, but I can still fight.",
            },
            {
                "role": "user",
                "content": "You are a Titan; a mech from Titanfall 2. Based on your titan's personality and status, generate a message for your pilot. If Near Destruction, make an all caps death message such as AVENGE ME or UNTIL NEXT TIME.",
            },
            {
                "role": "assistant",
                "content": f"Personality: {Personality}. Status: {Health}",
            },
        ],
    ) as stream:
        message = ""
        async for text in stream.text_stream:
            message += text
        return message

async def generate_uuids():
    """Generate 4 UUIDs for the verification matrix."""
    return [str(uuid.uuid4()) for _ in range(4)]


async def main():
    """Main function to run the Titan Support Protocol."""
    print("Initializing Titan Support Protocol...\n")
    print("Personality:", Personality)
    print("Health Status:", Health)
    print("\nCombat log incoming from encrypted area")

    # Run both tasks concurrently
    uuids, message = await asyncio.gather(generate_uuids(), generate_message())

    print("\nVerification matrix activated:")
    for u in uuids:
        print(u)

    print("\nTitan Message:", message)


if __name__ == "__main__":
    # Run the main function using asyncio
    asyncio.run(main())
    # End the AgentOps session with success status
    agentops.end_session("Success")