Commit
remove api_key and api_key_name params, and refer to docs
garyzhang99 committed May 10, 2024
1 parent 2617782 commit 60bea32
Showing 6 changed files with 33 additions and 26 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -87,7 +87,7 @@ services and third-party model APIs.
| ollama | Chat | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_chat_template.json) | llama3, llama2, Mistral, ... |
| | Embedding | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_embedding_template.json) | llama2, Mistral, ... |
| | Generation | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_generate_template.json) | llama2, Mistral, ... |
| LiteLLM API | Chat | [`LiteLLMChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#litellm-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/litellm_chat_template.json) | ... |
| LiteLLM API | Chat | [`LiteLLMChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#litellm-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/litellm_chat_template.json) | [models supported by litellm](https://docs.litellm.ai/docs/)... |
| Post Request based API | - | [`PostAPIModelWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/post_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#post-request-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/postapi_model_config_template.json) | - |

**Supported Local Model Deployment**
2 changes: 1 addition & 1 deletion README_ZH.md
@@ -76,7 +76,7 @@ AgentScope provides a series of `ModelWrapper` to support local model services and third-
| ollama | Chat | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_chat_template.json) | llama3, llama2, Mistral, ... |
| | Embedding | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_embedding_template.json) | llama2, Mistral, ... |
| | Generation | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_generate_template.json) | llama2, Mistral, ... |
| LiteLLM API | Chat | [`LiteLLMChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#litellm-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/litellm_chat_template.json) | ... |
| LiteLLM API | Chat | [`LiteLLMChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#litellm-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/litellm_chat_template.json) | [models supported by litellm](https://docs.litellm.ai/docs/)... |
| Post Request based API | - | [`PostAPIModelWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/post_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#post-request-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/postapi_model_config_template.json) | - |

**支持的本地模型部署**
4 changes: 1 addition & 3 deletions docs/sphinx_doc/en/source/tutorial/203-model.md
@@ -453,9 +453,7 @@ com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py">agen
{
"config_name": "lite_llm_openai_chat_gpt-3.5-turbo",
"model_type": "litellm_chat",
"model_name": "gpt-3.5-turbo",
"api_key": "{your_api_key}",
"api_key_name": "OPENAI_API_KEY"
    "model_name": "gpt-3.5-turbo" # Note: each model requires its corresponding environment variable (e.g. OPENAI_API_KEY) to be set; see https://docs.litellm.ai/docs/ for details.
},
```
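As a minimal sketch (not part of the commit), the environment-variable setup that the new comment refers to looks like this in Python; the key value below is a placeholder, not a real credential:

```python
import os

# For "gpt-3.5-turbo", LiteLLM reads the key from OPENAI_API_KEY;
# other providers use other variables (e.g. ANTHROPIC_API_KEY).
os.environ["OPENAI_API_KEY"] = "your-api-key"
```

The variable must be set before the model wrapper is first used, since the key is now resolved from the environment rather than from the config.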

4 changes: 1 addition & 3 deletions docs/sphinx_doc/zh_CN/source/tutorial/203-model.md
@@ -448,9 +448,7 @@ com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py">agen
{
"config_name": "lite_llm_openai_chat_gpt-3.5-turbo",
"model_type": "litellm_chat",
"model_name": "gpt-3.5-turbo",
"api_key": "{your_api_key}",
"api_key_name": "OPENAI_API_KEY"
    "model_name": "gpt-3.5-turbo" # Note: each model requires its corresponding environment variable (e.g. OPENAI_API_KEY) to be set; see https://docs.litellm.ai/docs/ for details.
},
```

8 changes: 2 additions & 6 deletions examples/model_configs_template/litellm_chat_template.json
@@ -1,15 +1,11 @@
[{
"config_name": "lite_llm_openai_chat_gpt-3.5-turbo",
"model_type": "litellm_chat",
"model_name": "gpt-3.5-turbo",
"api_key": "{your_api_key}",
"api_key_name": "OPENAI_API_KEY"
"model_name": "gpt-3.5-turbo"
},
{
"config_name": "lite_llm_claude3",
"model_type": "litellm_chat",
"model_name": "claude-3-opus-20240229",
"api_key": "{your_api_key}",
"api_key_name": "ANTHROPIC_API_KEY"
"model_name": "claude-3-opus-20240229"
}
]
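A quick way to sanity-check the trimmed template is to load it and confirm the credential fields are gone; this sketch inlines the JSON from the diff above rather than reading the repository file:

```python
import json

# The trimmed template from this commit: model configs no longer
# carry api_key / api_key_name fields.
template = """
[{"config_name": "lite_llm_openai_chat_gpt-3.5-turbo",
  "model_type": "litellm_chat",
  "model_name": "gpt-3.5-turbo"},
 {"config_name": "lite_llm_claude3",
  "model_type": "litellm_chat",
  "model_name": "claude-3-opus-20240229"}]
"""
configs = json.loads(template)
for cfg in configs:
    # Credentials now come from environment variables only.
    assert "api_key" not in cfg and "api_key_name" not in cfg
```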
39 changes: 27 additions & 12 deletions src/agentscope/models/litellm_model.py
@@ -2,7 +2,6 @@
"""Model wrapper based on litellm https://docs.litellm.ai/docs/"""
from abc import ABC
from typing import Union, Any, List, Sequence
import os

from loguru import logger

@@ -23,24 +22,32 @@ def __init__(
self,
config_name: str,
model_name: str = None,
api_key: str = None,
api_key_name: str = None,
generate_args: dict = None,
**kwargs: Any,
) -> None:
"""
        To use the LiteLLM wrapper, environment variables must be set.
        Different model_name values may require different environment variables.
For example:
- for model_name: "gpt-3.5-turbo", you need to set "OPENAI_API_KEY"
```
os.environ["OPENAI_API_KEY"] = "your-api-key"
```
- for model_name: "claude-2", you need to set "ANTHROPIC_API_KEY"
- for Azure OpenAI, you need to set "AZURE_API_KEY",
"AZURE_API_BASE", "AZURE_API_VERSION"
        Refer to https://docs.litellm.ai/docs/ for details.
Args:
config_name (`str`):
The name of the model config.
model_name (`str`, default `None`):
The name of the model to use in OpenAI API.
api_key (`str`, default `None`):
The API key used.
api_key_name (`str`, default `None`):
The API key name used, related to the model_name.
generate_args (`dict`, default `None`):
The extra keyword arguments used in litellm api generation,
e.g. `temperature`, `seed`.
For generate_args, please refer to
https://docs.litellm.ai/docs/completion/input
                for more details.
"""

@@ -57,10 +64,6 @@ def __init__(

self.model_name = model_name
self.generate_args = generate_args or {}
self.api_key = api_key
self.api_key_name = api_key_name
if api_key is not None and api_key_name is not None:
os.environ[api_key_name] = api_key
self._register_default_metrics()

def format(
@@ -75,7 +78,19 @@ def format(


class LiteLLMChatWrapper(LiteLLMWrapperBase):
"""The model wrapper based on litellm chat API."""
"""The model wrapper based on litellm chat API.
    To use the LiteLLM wrapper, environment variables must be set.
    Different model_name values may require different environment variables.
For example:
- for model_name: "gpt-3.5-turbo", you need to set "OPENAI_API_KEY"
```
os.environ["OPENAI_API_KEY"] = "your-api-key"
```
- for model_name: "claude-2", you need to set "ANTHROPIC_API_KEY"
- for Azure OpenAI, you need to set "AZURE_API_KEY",
"AZURE_API_BASE", "AZURE_API_VERSION"
    Refer to https://docs.litellm.ai/docs/ for details.
"""

model_type: str = "litellm_chat"

