feat: Add Voyage AI support #575

Open · wants to merge 28 commits into base: main

Commits (28)
f065f50
feat: Add Voyage AI support (#461)
devin-ai-integration[bot] Dec 12, 2024
7cf6400
chore: migrate tach config from YAML to TOML and add voyage provider …
devin-ai-integration[bot] Dec 12, 2024
2701eaf
style: Apply ruff formatting
devin-ai-integration[bot] Dec 12, 2024
ae686a7
fix: Add ci dependency group to tach.toml
devin-ai-integration[bot] Dec 12, 2024
2255c54
fix: Use correct dependency-group format in tach.toml
devin-ai-integration[bot] Dec 12, 2024
bd25d3b
fix: Update tach.toml dependency configuration format
devin-ai-integration[bot] Dec 12, 2024
2f6a5f6
feat: Enhance Voyage AI provider with async support and improved erro…
devin-ai-integration[bot] Dec 12, 2024
c653de8
fix: Update tach.toml to use dependency-group format
devin-ai-integration[bot] Dec 12, 2024
11b83e2
fix: Remove dependency configuration from tach.toml (#461)
devin-ai-integration[bot] Dec 13, 2024
9288655
docs: Add Voyage AI integration example notebook (#461)
devin-ai-integration[bot] Dec 13, 2024
d623a4a
style: Apply ruff formatting (#461)
devin-ai-integration[bot] Dec 13, 2024
b2adfe9
fix: Update VoyageProvider to handle multiple response formats
devin-ai-integration[bot] Dec 14, 2024
f694d69
style: Apply ruff formatting to Voyage AI integration files
devin-ai-integration[bot] Dec 14, 2024
56f3fea
fix: Update test mocking and event data serialization
devin-ai-integration[bot] Dec 14, 2024
ad922f8
style: Apply ruff formatting fixes
devin-ai-integration[bot] Dec 14, 2024
7bc41e6
style: Apply ruff formatting
devin-ai-integration[bot] Dec 14, 2024
3a2d13e
fix: Update event data serialization and remove hardcoded API keys
devin-ai-integration[bot] Dec 14, 2024
9233e21
style: Apply ruff formatting
devin-ai-integration[bot] Dec 14, 2024
4965dc3
fix: Remove sensitive data and fix event serialization
devin-ai-integration[bot] Dec 14, 2024
92a6f24
fix: Remove hardcoded API keys from create_notebook.py
devin-ai-integration[bot] Dec 14, 2024
1168ff0
style: Apply ruff formatting to verify_output.py
devin-ai-integration[bot] Dec 14, 2024
420006a
style: Apply ruff formatting
devin-ai-integration[bot] Dec 14, 2024
8315987
style: Apply ruff formatting
devin-ai-integration[bot] Dec 14, 2024
033a29b
Merge branch 'main' into devin/1733984552-voyage-ai-support
the-praxs Dec 16, 2024
1322010
purge unnecessary files
the-praxs Dec 16, 2024
c3277d8
update voyage examples page
the-praxs Dec 16, 2024
413ee42
add voyage to docs
the-praxs Dec 16, 2024
ff5df30
restructure voyage examples and move test file to tests directory
the-praxs Dec 16, 2024
206 changes: 206 additions & 0 deletions agentops/llms/providers/voyage.py
@@ -0,0 +1,206 @@
"""Voyage AI provider integration for AgentOps."""

import inspect
import warnings
import sys
import json
import pprint
import voyageai
from typing import Any, Dict, Optional, Callable
from agentops.llms.providers.instrumented_provider import InstrumentedProvider
from agentops.session import Session
from agentops.event import LLMEvent, ErrorEvent
from agentops.helpers import check_call_stack_for_agent_id, get_ISO_time
from agentops.log_config import logger
from agentops.singleton import singleton


def _check_python_version() -> None:
"""Check if the current Python version meets Voyage AI requirements."""
if sys.version_info < (3, 9):
warnings.warn(
"Voyage AI SDK requires Python >=3.9. Some functionality may not work correctly.",
UserWarning,
stacklevel=2,
)


@singleton
class VoyageProvider(InstrumentedProvider):
"""Provider for Voyage AI SDK integration.

Handles embedding operations and tracks usage through AgentOps.
Requires Python >=3.9 for full functionality.

Args:
client: Initialized Voyage AI client instance
"""

def __init__(self, client=None):
"""Initialize VoyageProvider with optional client."""
import sys
import warnings

if sys.version_info < (3, 9):
warnings.warn("Voyage AI requires Python >=3.9. Some functionality may not work correctly.", RuntimeWarning)

super().__init__(client or voyageai)
self._provider_name = "Voyage"
self._client = client or voyageai
self.original_embed = None
self.original_aembed = None
_check_python_version()

def embed(self, input_text: str, **kwargs) -> Dict[str, Any]:
"""Synchronous embed method."""
init_timestamp = get_ISO_time()
session = kwargs.pop("session", None) # Extract and remove session from kwargs

try:
# Call the patched function
response = self._client.embed(input_text, **kwargs)

# Handle response and create event
if session:
self.handle_response(
response, init_timestamp=init_timestamp, session=session, input_text=input_text, **kwargs
)

return response
except Exception as e:
if session:
self._safe_record(
session,
ErrorEvent(
exception=e,
trigger_event=LLMEvent(init_timestamp=init_timestamp, prompt="<redacted>", model="voyage-01"),
),
)
raise # Re-raise the exception without wrapping

async def aembed(self, input_text: str, **kwargs) -> Dict[str, Any]:
"""Asynchronous embed method."""
init_timestamp = get_ISO_time()
session = kwargs.pop("session", None) # Extract and remove session from kwargs

try:
# Call the patched function
response = await self._client.aembed(input_text, **kwargs)

# Handle response and create event
if session:
self.handle_response(
response, init_timestamp=init_timestamp, session=session, input_text=input_text, **kwargs
)

return response
except Exception as e:
if session:
self._safe_record(
session,
ErrorEvent(
exception=e,
trigger_event=LLMEvent(init_timestamp=init_timestamp, prompt="<redacted>", model="voyage-01"),
),
)
raise # Re-raise the exception without wrapping

def handle_response(
self,
response: Dict[str, Any],
init_timestamp: str = None,
session: Optional[Session] = None,
input_text: str = "",
**kwargs,
) -> None:
"""Handle the response from Voyage AI API and record event data.

Args:
response: The API response containing embedding data and usage information
init_timestamp: Optional timestamp for event initialization
session: Optional session for event recording
input_text: The original input text used for embedding
**kwargs: Additional keyword arguments from the original request
"""
if not session:
return

try:
# Extract usage information
usage = response.get("usage", {})
prompt_tokens = usage.get("input_tokens", 0)
completion_tokens = 0 # Embeddings don't have completion tokens

# Extract embedding data safely
embeddings = []
if "data" in response:
embeddings = [d.get("embedding", []) for d in response.get("data", [])]
elif "embeddings" in response:
embeddings = response.get("embeddings", [])

# Create LLM event with correct format
event = LLMEvent(
init_timestamp=init_timestamp or get_ISO_time(),
end_timestamp=get_ISO_time(),
model=response.get("model", "voyage-01"),
prompt=input_text,
prompt_tokens=prompt_tokens,
completion={"type": "embedding", "vector": embeddings[0] if embeddings else []},
completion_tokens=completion_tokens,
cost=0.0, # Voyage AI doesn't provide cost information
params={"input_text": input_text},
returns={"usage": usage, "model": response.get("model", "voyage-01"), "data": response.get("data", [])},
)

session.record(event)
except Exception as e:
error_event = ErrorEvent(
exception=e,
trigger_event=LLMEvent(
init_timestamp=init_timestamp or get_ISO_time(), prompt="<redacted>", model="voyage-01"
),
)
self._safe_record(session, error_event)
logger.warning("Unable to process embedding response")

def override(self):
"""Override the original SDK methods with instrumented versions."""
self._override_sync_embed()
self._override_async_embed()

def _override_sync_embed(self):
"""Override synchronous embed method."""
# Store the original method
self.original_embed = self._client.__class__.embed

def patched_embed(client_self, input_text: str, **kwargs):
"""Sync patched embed method."""
try:
return self.original_embed(client_self, input_text, **kwargs)
except Exception as e:
raise # Re-raise without wrapping

# Override method with instrumented version
self._client.__class__.embed = patched_embed

def _override_async_embed(self):
"""Override asynchronous embed method."""
# Store the original method
self.original_aembed = self._client.__class__.aembed

async def patched_embed_async(client_self, input_text: str, **kwargs):
"""Async patched embed method."""
try:
return await self.original_aembed(client_self, input_text, **kwargs)
except Exception as e:
raise # Re-raise without wrapping

# Override method with instrumented version
self._client.__class__.aembed = patched_embed_async

def undo_override(self):
"""Restore the original SDK methods."""
if self.original_embed is not None:
self._client.__class__.embed = self.original_embed
if self.original_aembed is not None:
self._client.__class__.aembed = self.original_aembed
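
For readers skimming the diff, a minimal usage sketch of the provider above may help. It is illustrative rather than part of the PR: it assumes a session created explicitly with `agentops.start_session()` and manual wiring via `override()` (AgentOps normally instruments supported providers automatically), and it does not verify whether the underlying `voyageai` client accepts a bare string for the text argument.

```python
# Illustrative sketch only (not part of this PR's diff); assumes an explicit
# session and manual provider wiring.
import agentops
import voyageai

from agentops.llms.providers.voyage import VoyageProvider

agentops.init("<YOUR_AGENTOPS_API_KEY>")
session = agentops.start_session()

vo_client = voyageai.Client()  # picks up VOYAGE_API_KEY from the environment
provider = VoyageProvider(client=vo_client)
provider.override()  # patch embed/aembed with instrumented versions

try:
    # `session` is popped from kwargs inside embed() and used to record an LLMEvent
    result = provider.embed("Hello world!", model="voyage-large-2", session=session)
    print(result)
finally:
    provider.undo_override()  # restore the original SDK methods
    agentops.end_session("Success")
```

Note that `override()` patches `embed`/`aembed` on the client class, so it affects every `voyageai` client instance until `undo_override()` restores the originals.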
1 change: 1 addition & 0 deletions docs/mint.json
@@ -102,6 +102,7 @@
"v1/integrations/ollama",
"v1/integrations/openai",
"v1/integrations/rest",
"v1/integrations/voyage",
"v1/integrations/xai"
]
},
13 changes: 9 additions & 4 deletions docs/v1/examples/examples.mdx
@@ -37,6 +37,10 @@ mode: "wide"
AutoGen multi-agent conversable workflow with tool usage
</Card>

<Card title="CamelAI" icon={<img src="https://www.github.com/agentops-ai/agentops/blob/main/docs/images/external/camel/camel.png?raw=true" alt="CamelAI" />} iconType="image" href="/v1/examples/camel">
Track and analyze CAMEL agents including LLM and Tool usage
</Card>

<Card title="Cohere" icon={<img src="https://www.github.com/agentops-ai/agentops/blob/main/docs/images/external/cohere/cohere-logo.svg?raw=true" alt="Cohere" />} iconType="image" href="/v1/integrations/cohere">
First class support for Command-R-Plus and chat streaming
</Card>
@@ -77,17 +81,18 @@ mode: "wide"
First class support for GPT family of models
</Card>

<Card title="CamelAI" icon={<img src="https://www.github.com/agentops-ai/agentops/blob/main/docs/images/external/camel/camel.png?raw=true" alt="CamelAI" />} iconType="image" href="/v1/examples/camel">
Track and analyze CAMEL agents including LLM and Tool usage
</Card>

<Card title="REST API" icon="bolt-lightning" href="/v1/examples/restapi">
Create a REST server that performs and observes agent tasks
</Card>

<Card title="Voyage AI" icon={<img src="https://www.github.com/agentops-ai/agentops/blob/main/docs/images/external/voyage/voyage-logo.png?raw=true" alt="Voyage AI" />} iconType="image" href="/v1/integrations/voyage">
High-performance embeddings with comprehensive usage tracking
</Card>

<Card title="xAI" icon={<img src="https://www.github.com/agentops-ai/agentops/blob/main/docs/images/external/xai/xai-logo.png?raw=true" alt="xAI" />} iconType="image" href="/v1/integrations/xai">
Observe the power of Grok and Grok Vision with AgentOps
</Card>

</CardGroup>

## Video Guides
104 changes: 104 additions & 0 deletions docs/v1/integrations/voyage.mdx
@@ -0,0 +1,104 @@
---
title: Voyage AI
description: "AgentOps provides first class support for Voyage AI's models"
---

import CodeTooltip from '/snippets/add-code-tooltip.mdx'
import EnvTooltip from '/snippets/add-env-tooltip.mdx'

[Voyage AI](https://voyageai.com) provides state-of-the-art embedding models. Explore their [documentation](https://docs.voyageai.com) to learn more.

## Steps to Integrate Voyage AI with AgentOps

<Steps>
  <Step title="Install the AgentOps SDK">
    <CodeGroup>
      ```bash pip
      pip install agentops
      ```
      ```bash poetry
      poetry add agentops
      ```
    </CodeGroup>
  </Step>
  <Step title="Install the Voyage AI SDK">
    <CodeGroup>
      ```bash pip
      pip install voyageai
      ```
      ```bash poetry
      poetry add voyageai
      ```
    </CodeGroup>
  </Step>
  <Step title="Initialize AgentOps and develop with Voyage">
    <CodeTooltip/>
    <CodeGroup>
      ```python python
      import voyageai
      import agentops

      agentops.init(<INSERT YOUR API KEY HERE>)
      client = voyageai.Client(api_key="your_voyage_api_key")

      # Your code here...

      agentops.end_session('Success')
      ```
    </CodeGroup>
    <EnvTooltip />
    <CodeGroup>
      ```python .env
      AGENTOPS_API_KEY=<YOUR API KEY>
      ```
    </CodeGroup>
    Read more about environment variables in [Advanced Configuration](/v1/usage/advanced-configuration)
  </Step>
</Steps>

## Full Examples

<CodeGroup>
```python sync
import voyageai
import agentops

agentops.init(<INSERT YOUR API KEY HERE>)
client = voyageai.Client(api_key="your_voyage_api_key")

# Create embeddings
embeddings = client.embed(
    texts=["Hello world!", "Goodbye world!"],
    model="voyage-large-2"
)

print(embeddings)
agentops.end_session('Success')
```

```python async
import voyageai
import agentops
import asyncio

async def main():
    agentops.init(<INSERT YOUR API KEY HERE>)
    client = voyageai.AsyncClient(api_key="your_voyage_api_key")

    embeddings = await client.embed(
        texts=["Hello world!", "Goodbye world!"],
        model="voyage-large-2"
    )

    print(embeddings)
    agentops.end_session('Success')

asyncio.run(main())
```
</CodeGroup>

<script type="module" src="/scripts/github_stars.js"></script>
<script type="module" src="/scripts/scroll-img-fadein-animation.js"></script>
<script type="module" src="/scripts/button_heartbeat_animation.js"></script>
<script type="css" src="/styles/styles.css"></script>
<script type="module" src="/scripts/adjust_api_dynamically.js"></script>
1 change: 0 additions & 1 deletion pyproject.toml
@@ -50,7 +50,6 @@ langchain = [
"langchain==0.2.14; python_version >= '3.8.1'"
]


[project.urls]
Homepage = "https://github.com/AgentOps-AI/agentops"
Issues = "https://github.com/AgentOps-AI/agentops/issues"
6 changes: 6 additions & 0 deletions tach.toml
@@ -0,0 +1,6 @@
[modules]
path = "agentops"
depends_on = []

[dependency-group.ci]
tach = "~=0.9"
19 changes: 0 additions & 19 deletions tach.yml

This file was deleted.
