Add ZhipuAI model wrappers (#181)
---------

Co-authored-by: wenhao <[email protected]>
garyzhang99 and wenhao authored May 6, 2024
1 parent 1c993f9 commit be8db4c
Showing 13 changed files with 699 additions and 0 deletions.
3 changes: 3 additions & 0 deletions README.md
@@ -82,6 +82,8 @@ services and third-party model APIs.
| | Multimodal | [`DashScopeMultiModalWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#dashscope-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/dashscope_multimodal_template.json) | qwen-vl-max, qwen-vl-chat-v1, qwen-audio-chat |
| Gemini API | Chat | [`GeminiChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#gemini-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/gemini_chat_template.json) | gemini-pro, ... |
| | Embedding | [`GeminiEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#gemini-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/gemini_embedding_template.json) | models/embedding-001, ... |
| ZhipuAI API | Chat | [`ZhipuAIChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#zhipu-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/zhipu_chat_template.json) | glm-4, ... |
| | Embedding | [`ZhipuAIEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#zhipu-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/zhipu_embedding_template.json) | embedding-2, ... |
| ollama | Chat | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_chat_template.json) | llama3, llama2, Mistral, ... |
| | Embedding | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_embedding_template.json) | llama2, Mistral, ... |
| | Generation | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_generate_template.json) | llama2, Mistral, ... |
@@ -118,6 +120,7 @@ the following libraries.
- [Self-Organizing Conversation](./examples/conversation_self_organizing)
- [Basic Conversation with LangChain library](./examples/conversation_with_langchain)
- [Conversation with ReAct Agent](./examples/conversation_with_react_agent)
- [Conversation in Natural Language to Query SQL](./examples/conversation_nl2sql/)
- [Conversation with RAG Agent](./examples/conversation_with_RAG_agents)

- Game
2 changes: 2 additions & 0 deletions README_ZH.md
@@ -71,6 +71,8 @@ AgentScope provides a series of `ModelWrapper`s to support local model services and third-party
| | Multimodal | [`DashScopeMultiModalWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#dashscope-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/dashscope_multimodal_template.json) | qwen-vl-max, qwen-vl-chat-v1, qwen-audio-chat |
| Gemini API | Chat | [`GeminiChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#gemini-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/gemini_chat_template.json) | gemini-pro, ... |
| | Embedding | [`GeminiEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#gemini-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/gemini_embedding_template.json) | models/embedding-001, ... |
| ZhipuAI API | Chat | [`ZhipuAIChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#zhipu-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/zhipu_chat_template.json) | glm-4, ... |
| | Embedding | [`ZhipuAIEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#zhipu-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/zhipu_embedding_template.json) | embedding-2, ... |
| ollama | Chat | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_chat_template.json) | llama3, llama2, Mistral, ... |
| | Embedding | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_embedding_template.json) | llama2, Mistral, ... |
| | Generation | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | [guidance](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#ollama-api) <br> [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_generate_template.json) | llama2, Mistral, ... |
45 changes: 45 additions & 0 deletions docs/sphinx_doc/en/source/tutorial/203-model.md
@@ -14,6 +14,7 @@ Currently, AgentScope supports the following model service APIs:
- OpenAI API, including chat, image generation (DALL-E), and Embedding.
- DashScope API, including chat, image synthesis, and text embedding.
- Gemini API, including chat and embedding.
- ZhipuAI API, including chat and embedding.
- Ollama API, including chat, embedding and generation.
- Post Request API, model inference services based on Post
requests, including Huggingface/ModelScope Inference API and various
@@ -81,6 +82,8 @@ In the current AgentScope, the supported `model_type` types, the corresponding
| | Multimodal | [`DashScopeMultiModalWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py) | `"dashscope_multimodal"` | qwen-vl-plus, qwen-vl-max, qwen-audio-turbo, ... |
| Gemini API | Chat | [`GeminiChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | `"gemini_chat"` | gemini-pro, ... |
| | Embedding | [`GeminiEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | `"gemini_embedding"` | models/embedding-001, ... |
| ZhipuAI API | Chat | [`ZhipuAIChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | `"zhipuai_chat"` | glm-4, ... |
| | Embedding | [`ZhipuAIEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | `"zhipuai_embedding"` | embedding-2, ... |
| ollama | Chat | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | `"ollama_chat"` | llama2, ... |
| | Embedding | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | `"ollama_embedding"` | llama2, ... |
| | Generation | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | `"ollama_generate"` | llama2, ... |
@@ -303,6 +306,48 @@ Here we provide example configurations for different model wrappers.

<br/>


#### ZhipuAI API

<details>
<summary>ZhipuAI Chat API (<code><a href="https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py">agentscope.models.ZhipuAIChatWrapper</a></code>)</summary>

```python
{
"config_name": "my_zhipuai_chat_config",
"model_type": "zhipuai_chat",

# Required parameters
"model_name": "{model_name}", # The model name in ZhipuAI API, e.g. glm-4

# Optional parameters
"api_key": "{your_api_key}"
}
```

</details>

<details>
<summary>ZhipuAI Embedding API (<code><a href="https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py">agentscope.models.ZhipuAIEmbeddingWrapper</a></code>)</summary>

```python
{
"config_name": "my_zhipuai_embedding_config",
"model_type": "zhipuai_embedding",

# Required parameters
"model_name": "{model_name}", # The model name in ZhipuAI API, e.g. embedding-2

# Optional parameters
"api_key": "{your_api_key}",
}
```

</details>
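
Below is a minimal sketch of how a config like the ones above might be used end-to-end. It assumes the standard `agentscope.init` / `DialogAgent` workflow; the config values, agent name, and placeholder API key are illustrative, so adapt them to your setup.

```python
import agentscope
from agentscope.agents import DialogAgent
from agentscope.message import Msg

# Register the ZhipuAI chat config (fill in a real API key before running).
agentscope.init(
    model_configs=[
        {
            "config_name": "my_zhipuai_chat_config",
            "model_type": "zhipuai_chat",
            "model_name": "glm-4",
            "api_key": "{your_api_key}",
        },
    ],
)

# The agent refers to the model config by its config_name.
agent = DialogAgent(
    name="assistant",
    sys_prompt="You're a helpful assistant.",
    model_config_name="my_zhipuai_chat_config",
)

reply = agent(Msg(name="user", content="Hi!", role="user"))
print(reply.content)
```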

<br/>


#### Ollama API

<details>
42 changes: 42 additions & 0 deletions docs/sphinx_doc/en/source/tutorial/206-prompt.md
@@ -44,6 +44,7 @@ generation model APIs.
- [OllamaChatWrapper](#ollamachatwrapper)
- [OllamaGenerationWrapper](#ollamagenerationwrapper)
- [GeminiChatWrapper](#geminichatwrapper)
- [ZhipuAIChatWrapper](#zhipuaichatwrapper)

These strategies are implemented in the `format` functions of the model
wrapper classes.
@@ -365,6 +366,47 @@ print(prompt)
]
```

### `ZhipuAIChatWrapper`

`ZhipuAIChatWrapper` encapsulates the ZhipuAI chat API, which takes a list of messages as input. The messages must obey the following rules:

- Each message requires `role` and `content` fields, and `role` must be one of `"user"`, `"system"`, or `"assistant"`.
- There must be at least one `user` message.

#### Prompt Strategy

If the `role` field of the first message is `"system"`, it is kept as a single message with `role` `"system"` and the system message as its `content`. The remaining messages are merged into a single message with `role` `"user"` and the dialogue history as its `content`.

An example is shown below:

```python
from agentscope.models import ZhipuAIChatWrapper
from agentscope.message import Msg

model = ZhipuAIChatWrapper(
    config_name="",  # empty since we directly initialize the model wrapper
    model_name="glm-4",
    api_key="your api key",
)

prompt = model.format(
    Msg("system", "You're a helpful assistant", role="system"),  # Msg object
    [  # a list of Msg objects
        Msg(name="Bob", content="Hi!", role="assistant"),
        Msg(name="Alice", content="Nice to meet you!", role="assistant"),
    ],
)
print(prompt)
```

```bash
[
{"role": "system", "content": "You are a helpful assistant"},
{"role": "user", "content": "## Dialogue History\nBob: Hi!\nAlice: Nice to meet you!"},
]
```
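
Once formatted, the prompt can be passed back to the wrapper to obtain a completion. The snippet below is a minimal sketch that assumes the wrapper is callable with the formatted message list and returns a response object exposing a `text` field; check `zhipu_model.py` for the exact interface.

```python
# Feed the formatted messages to the model (assumed callable interface).
response = model(prompt)
print(response.text)  # the assistant's reply as plain text
```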

## Prompt Engine (Will be deprecated in the future)

AgentScope provides the `PromptEngine` class to simplify the process of crafting
45 changes: 45 additions & 0 deletions docs/sphinx_doc/zh_CN/source/tutorial/203-model.md
@@ -11,6 +11,7 @@ In AgentScope, model deployment and invocation are decoupled through `ModelWrapper`
- OpenAI API, including chat, image generation (DALL-E), and text embedding.
- DashScope API, including chat, image synthesis, and text embedding.
- Gemini API, including chat and embedding.
- ZhipuAI API, including chat and embedding.
- Ollama API, including chat, embedding, and generation.
- Post Request API, model inference services based on Post requests, including Huggingface/ModelScope
Inference API and various APIs that follow the Post request format.
@@ -101,6 +102,8 @@ the supported APIs are as follows:
| | Multimodal | [`DashScopeMultiModalWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py) | `"dashscope_multimodal"` | qwen-vl-plus, qwen-vl-max, qwen-audio-turbo, ... |
| Gemini API | Chat | [`GeminiChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | `"gemini_chat"` | gemini-pro, ... |
| | Embedding | [`GeminiEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py) | `"gemini_embedding"` | models/embedding-001, ... |
| ZhipuAI API | Chat | [`ZhipuAIChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | `"zhipuai_chat"` | glm-4, ... |
| | Embedding | [`ZhipuAIEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py) | `"zhipuai_embedding"` | embedding-2, ... |
| ollama | Chat | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | `"ollama_chat"` | llama2, ... |
| | Embedding | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | `"ollama_embedding"` | llama2, ... |
| | Generation | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py) | `"ollama_generate"` | llama2, ... |
@@ -323,6 +326,48 @@ the supported APIs are as follows:

<br/>


#### ZhipuAI API

<details>
<summary>ZhipuAI Chat API (<code><a href="https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py">agentscope.models.ZhipuAIChatWrapper</a></code>)</summary>

```python
{
"config_name": "my_zhipuai_chat_config",
"model_type": "zhipuai_chat",

# Required parameters
"model_name": "{model_name}", # The model name in ZhipuAI API, e.g. glm-4

# Optional parameters
"api_key": "{your_api_key}"
}
```

</details>

<details>
<summary>ZhipuAI Embedding API (<code><a href="https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py">agentscope.models.ZhipuAIEmbeddingWrapper</a></code>)</summary>

```python
{
"config_name": "my_zhipuai_embedding_config",
"model_type": "zhipuai_embedding",

# Required parameters
"model_name": "{model_name}", # The model name in ZhipuAI API, e.g. embedding-2

# Optional parameters
"api_key": "{your_api_key}",
}
```

</details>
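
As a rough sketch, the embedding wrapper can also be constructed and called directly, without going through `agentscope.init`. The positional text argument and the `embedding` field on the response are assumptions based on the other embedding wrappers; verify them against `zhipu_model.py`.

```python
from agentscope.models import ZhipuAIEmbeddingWrapper

embedder = ZhipuAIEmbeddingWrapper(
    config_name="",  # empty since we directly initialize the wrapper
    model_name="embedding-2",
    api_key="your api key",
)

# Embed a single piece of text; the response is assumed to carry the
# embedding vector(s) in its `embedding` field.
response = embedder("Nice to meet you!")
print(response.embedding)
```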

<br/>


#### Ollama API

<details>
44 changes: 44 additions & 0 deletions docs/sphinx_doc/zh_CN/source/tutorial/206-prompt.md
@@ -29,6 +29,7 @@ AgentScope provides built-in prompt construction strategies for the following model APIs.
- [OllamaChatWrapper](#ollamachatwrapper)
- [OllamaGenerationWrapper](#ollamagenerationwrapper)
- [GeminiChatWrapper](#geminichatwrapper)
- [ZhipuAIChatWrapper](#zhipuaichatwrapper)

These strategies are implemented in the `format` functions of the corresponding model wrapper classes. A `format` function accepts a `Msg` object, a list of `Msg` objects, or a mixture of both as input, and reorganizes the input into a list of `Msg` objects. For convenience, in the following sections we assume the input to `format` is a list of `Msg` objects.

@@ -326,6 +327,49 @@ print(prompt)
]
```


### `ZhipuAIChatWrapper`

`ZhipuAIChatWrapper` encapsulates the ZhipuAI chat API, which accepts a list of messages or a string prompt as input. Similar to the DashScope chat API, if a list of messages is passed, it must obey the following rules:

- Each message requires `role` and `content` fields, and `role` must be one of `"user"`, `"system"`, or `"assistant"`.
- There must be at least one `user` message.

These requirements make it difficult to build multi-agent conversations in which agents may play many different roles and speak consecutively.
Therefore, the built-in `format` function converts the list of messages into a string prompt and wraps it in a single `user` message.

#### Prompt Strategy

If the `role` field of the first message is `"system"`, it is kept as a single message with `role` `"system"` and the system message as its `content`. The remaining messages are merged into a single message with `role` `"user"` and the dialogue history as its `content`.
An example is shown below:

```python
from agentscope.models import ZhipuAIChatWrapper
from agentscope.message import Msg

model = ZhipuAIChatWrapper(
    config_name="",  # empty since we directly initialize the model wrapper
    model_name="glm-4",
    api_key="your api key",
)

prompt = model.format(
    Msg("system", "You're a helpful assistant", role="system"),  # Msg object
    [  # a list of Msg objects
        Msg(name="Bob", content="Hi!", role="assistant"),
        Msg(name="Alice", content="Nice to meet you!", role="assistant"),
    ],
)
print(prompt)
```

```bash
[
{"role": "system", "content": "You are a helpful assistant"},
{"role": "user", "content": "## Dialogue History\nBob: Hi!\nAlice: Nice to meet you!"},
]
```

## The `PromptEngine` Class (will be deprecated in the future)

The `PromptEngine` class provides a structured way to combine different prompt components, such as instructions, hints, dialogue history, and user input, into a format suitable for the underlying language model.
7 changes: 7 additions & 0 deletions examples/model_configs_template/zhipu_chat_template.json
@@ -0,0 +1,7 @@
[{
    "config_name": "zhipuai_chat-glm",
    "model_type": "zhipuai_chat",
    "model_name": "glm-4",
    "api_key": "{your_api_key}"
}
]
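
A hedged sketch of how such a template might be consumed: `agentscope.init` is assumed to accept a path to a model-config JSON file, and the path below is illustrative. The `{your_api_key}` placeholder must be replaced with a real key first.

```python
import agentscope

# Load all model configs defined in the template file (illustrative path).
agentscope.init(
    model_configs="examples/model_configs_template/zhipu_chat_template.json",
)
```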