Commit

litellm format doc
garyzhang99 committed May 17, 2024
1 parent 3ae6e13 commit 2d351f2
Showing 2 changed files with 89 additions and 0 deletions.
49 changes: 49 additions & 0 deletions docs/sphinx_doc/en/source/tutorial/206-prompt.md
@@ -300,6 +300,55 @@ print(prompt)
]
```


### LiteLLMChatWrapper

`LiteLLMChatWrapper` encapsulates the litellm chat API, which takes a list of
messages as input. LiteLLM supports many different model providers, and each
model may require a different prompt format. To simplify usage, we provide a
format that is compatible with most models. If a more specific format is
needed, you can refer to the documentation of the model you use, as well as the
[litellm](https://github.com/BerriAI/litellm) documentation, to customize a
format function for your model.

- Format all messages in the chat history into a single message whose `role` is `"user"`.

#### Prompt Strategy

- The prompt consists of a single `user` message containing the system message, followed by the `"## Dialogue History"` header and the dialogue history.

```python
from agentscope.message import Msg
from agentscope.models import LiteLLMChatWrapper

model = LiteLLMChatWrapper(
config_name="", # empty since we directly initialize the model wrapper
model_name="gpt-3.5-turbo",
)

prompt = model.format(
Msg("system", "You are a helpful assistant", role="system"),
[
Msg("user", "What is the weather today?", role="user"),
Msg("assistant", "It is sunny today", role="assistant"),
],
)

print(prompt)
```

```bash
[
{
"role": "user",
"content": (
"You are a helpful assistant\n\n"
"## Dialogue History\nuser: What is the weather today?\n"
"assistant: It is sunny today"
),
},
]
```
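The prompt strategy above can be sketched as a standalone function. This is a minimal illustration of the merging behavior, not agentscope's internal implementation; the helper name `format_for_litellm` and the plain-dict message representation are assumptions made for the sketch.

```python
# Minimal sketch of the prompt strategy described above: merge the system
# prompt and the dialogue history into a single "user" message.
# NOTE: `format_for_litellm` is a hypothetical helper for illustration;
# it is not part of the agentscope API.

def format_for_litellm(system, history):
    """`system` and each entry of `history` are dicts with "name" and "content" keys."""
    # Render each history message as "name: content" on its own line.
    history_lines = "\n".join(f'{m["name"]}: {m["content"]}' for m in history)
    # Prefix with the system prompt and the "## Dialogue History" header.
    content = f'{system["content"]}\n\n## Dialogue History\n{history_lines}'
    return [{"role": "user", "content": content}]

prompt = format_for_litellm(
    {"name": "system", "content": "You are a helpful assistant"},
    [
        {"name": "user", "content": "What is the weather today?"},
        {"name": "assistant", "content": "It is sunny today"},
    ],
)
print(prompt)
```

Collapsing everything into one `user` message is what makes this format portable across providers: every chat model accepts a single user turn, whereas not all of them accept `system` or multi-turn role sequences in the same way.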

### OllamaChatWrapper

`OllamaChatWrapper` encapsulates the Ollama chat API, which takes a list of
40 changes: 40 additions & 0 deletions docs/sphinx_doc/zh_CN/source/tutorial/206-prompt.md
@@ -271,6 +271,46 @@ print(prompt)
]
```

### LiteLLMChatWrapper

`LiteLLMChatWrapper` encapsulates the litellm chat API, which takes a list of messages as input. LiteLLM supports many different model providers, and each model may require a different prompt format. To simplify usage, we provide a format that is compatible with most models. If a more specific format is needed, you can refer to the documentation of the model you use, as well as the [litellm](https://github.com/BerriAI/litellm) documentation, to customize a format function for your model.

- Format all messages in the chat history into a single message whose `role` is `"user"`.

#### Prompt Strategy

- The prompt consists of a single `user` message containing the system message, followed by the `"## Dialogue History"` header and the dialogue history.


```python
from agentscope.message import Msg
from agentscope.models import LiteLLMChatWrapper

model = LiteLLMChatWrapper(
config_name="", # empty since we directly initialize the model wrapper
model_name="gpt-3.5-turbo",
)

prompt = model.format(
Msg("system", "You are a helpful assistant", role="system"),
[
Msg("user", "What is the weather today?", role="user"),
Msg("assistant", "It is sunny today", role="assistant"),
],
)

print(prompt)
```

```bash
[
{
"role": "user",
"content": (
"You are a helpful assistant\n\n"
"## Dialogue History\nuser: What is the weather today?\n"
"assistant: It is sunny today"
),
},
]
```

### `OllamaChatWrapper`

`OllamaChatWrapper` encapsulates the Ollama chat API, which takes a list of messages as input. The messages must obey the following rules (updated on 2024/03/22):
