From 43c94514653ac508610eb604905fcedf36a2f5d5 Mon Sep 17 00:00:00 2001
From: zyzhang1130 <36942574+zyzhang1130@users.noreply.github.com>
Date: Mon, 6 May 2024 11:06:11 +0800
Subject: [PATCH 1/5] updated conversation_with_mentions README according to 0_python_example_template

---
 examples/conversation_with_mentions/README.md | 75 ++++++-------------
 1 file changed, 21 insertions(+), 54 deletions(-)

diff --git a/examples/conversation_with_mentions/README.md b/examples/conversation_with_mentions/README.md
index 6359b3413..2cb12ea52 100644
--- a/examples/conversation_with_mentions/README.md
+++ b/examples/conversation_with_mentions/README.md
@@ -1,73 +1,40 @@
+###
 # Multi-Agent Group Conversation in AgentScope
-This example demonstrates a multi-agent group conversation facilitated by AgentScope. The script `main.py` sets up a virtual chat room where a user agent interacts with several NPC (non-player character) agents. The chat utilizes a special **"@"** mention functionality, which allows participants to address specific agents and have a more directed conversation.
+This example demonstrates a multi-agent group conversation facilitated by AgentScope. The script sets up a virtual chat room where a user agent interacts with several NPC (non-player character) agents. Participants can use a special "@" mention functionality to address specific agents directly.
-## Key Features
+## Background
-- **Real-time Group Conversation**: Engage in a chat with multiple agents responding in real time.
-- **@ Mention Functionality**: Use the "@" symbol followed by an agent's name to specifically address that agent within the conversation.
-- **Dynamic Flow**: User-driven conversation with agents responding based on the context and mentions.
-- **Configurable Agent Roles**: Easily modify agent roles and behaviors by editing the `sys_prompt` in the configuration files.
-- **User Timeout**: If the user does not respond within a specified time, the conversation continues with the next agent.
+The conversation takes place in a simulated chat room environment with predefined roles for each participant. Topics are open-ended and evolve based on the user's input and agents' responses.
-## How to Use
-
-To start the group conversation, follow these steps:
-
-1. Make sure to set your `api_key` in the `configs/model_configs.json` file.
-2. Run the script using the following command:
+## Tested Models
-```bash
-python main.py
+These models are tested in this example. For other models, some modifications may be needed.
+- gemini_chat (models/gemini-pro, models/gemini-1.0-pro)
+- dashscope_chat (qwen-max, qwen-turbo)
+- ollama_chat (ollama_llama3_8b)
-# or launch agentscope studio
-as_studio main.py
-```
+## Prerequisites
-1. To address a specific agent in the chat, type "@" followed by the agent's name in your message.
-2. To exit the chat, simply type "exit" when it's your turn to speak.
+To run this example, meet the following requirements:
+- Set your `api_key` in the `configs/model_configs.json` file
+- Optional: Launch agentscope studio with `as_studio main.py`
-## Background and Conversation Flow
-
-The conversation takes place in a simulated chat room environment with roles defined for each participant. The user acts as a regular chat member with the ability to speak freely and address any agent. NPC agents are pre-configured with specific roles that determine their responses and behavior in the chat. The topic of the conversation is open-ended and can evolve organically based on the user's input and agents' programmed personas.
-
-### Example Interaction
+## How to Use
-
-```
-User input: Hi, everyone! I'm excited to join this chat.
-AgentA: Welcome! We're glad to have you here.
-User input: @AgentB, what do you think about the new technology trends?
-AgentB: It's an exciting time for tech! There are so many innovations on the horizon.
-...
-```
+1. Run the script using the command: `python main.py`
+2. Address specific agents by typing "@" followed by the agent's name.
+3. Type "exit" to leave the chat.
 ## Customization Options
-The group conversation script provides several options for customization, allowing you to tailor the chat experience to your preferences.
-
-You can customize the conversation by editing the agent configurations and model parameters. The `agent_configs.json` file allows you to set specific behaviors for each NPC agent, while `model_configs.json` contains the parameters for the conversation model.
+You can adjust the behavior and parameters of the NPC agents and conversation model by editing the `agent_configs.json` and `model_configs.json` files, respectively.
 ### Changing User Input Time Limit
-The `USER_TIME_TO_SPEAK` variable sets the time limit (in seconds) for the user to input their message during each round. By default, this is set to 10 seconds. You can adjust this time limit by modifying the value of `USER_TIME_TO_SPEAK` in the `main.py` script.
-
-For example, to change the time limit to 20 seconds, update the line in `main.py` as follows:
-
-```
-USER_TIME_TO_SPEAK = 20 # User has 20 seconds to type their message
-```
+Adjust the `USER_TIME_TO_SPEAK` variable in the `main.py` script to change the time limit for user input.
 ### Setting a Default Topic for the Chat Room
-The `DEFAULT_TOPIC` variable defines the initial message or topic of the chat room. It sets the stage for the conversation and is announced at the beginning of the chat session. You can change this message to prompt a specific discussion topic or to provide instructions to the agents.
-
-To customize this message, modify the `DEFAULT_TOPIC` variable in the `main.py` script. For instance, if you want to set the default topic to discuss "The Future of Artificial Intelligence," you would change the code as follows:
-```python
-DEFAULT_TOPIC = """
-This is a chat room about the Future of Artificial Intelligence and you can
-speak freely and briefly.
-"""
-```
-
-With these customizations, the chat room can be tailored to fit specific themes or time constraints, enhancing the user's control over the chat experience.
+Modify the `DEFAULT_TOPIC` variable in the `main.py` script to set the initial topic of the chat room.
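+
+For example, a quick adjustment of both variables in `main.py` could look like the snippet below; the 20-second limit and the topic text are only illustrative values:
+
+```python
+USER_TIME_TO_SPEAK = 20  # seconds the user has to type a message each round
+
+DEFAULT_TOPIC = """
+This is a chat room about the Future of Artificial Intelligence and you can
+speak freely and briefly.
+"""
+```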
+###
\ No newline at end of file

From 90c006c078f5b3f52453a51dabbd402d9bbc99c0 Mon Sep 17 00:00:00 2001
From: zyzhang1130 <36942574+zyzhang1130@users.noreply.github.com>
Date: Thu, 9 May 2024 14:32:17 +0800
Subject: [PATCH 2/5] Update README.md for `conversation_basic`

Update README.md for `conversation_basic` according to `0_python_example_template`

---
 examples/conversation_basic/README.md | 12 ++++++++++--
 1 file changed, 10 insertions(+), 2 deletions(-)

diff --git a/examples/conversation_basic/README.md b/examples/conversation_basic/README.md
index 1bdd093a2..e5ee3e320 100644
--- a/examples/conversation_basic/README.md
+++ b/examples/conversation_basic/README.md
@@ -1,5 +1,6 @@
 # Multi-Agent Conversation in AgentScope
-This is a demo of how to program a multi-agent conversation in AgentScope.
+
+This example will show how to program a multi-agent conversation in AgentScope.
 Complete code is in `conversation.py`, which set up a user agent and an assistant agent to have a conversation. When user input "exit", the conversation ends.
 You can modify the `sys_prompt` to change the role of assistant agent.
 # Note: Set your api_key in conversation.py first
 python conversation.py
 ```
+## Tested Models
+
+These models are tested in this example. For other models, some modifications may be needed.
+- dashscope_chat(qwen-max)
+- ollama_chat (ollama_llama3_8b)
+
+## Prerequisites
 To set up model serving with open-source LLMs, follow the guidance in
-[scripts/REAMDE.md](../../scripts/README.md).
\ No newline at end of file
+[scripts/README.md](../../scripts/README.md).

From a4cf58607936899edb30855c707226956fadb9fb Mon Sep 17 00:00:00 2001
From: zyzhang1130 <36942574+zyzhang1130@users.noreply.github.com>
Date: Thu, 9 May 2024 21:03:34 +0800
Subject: [PATCH 3/5] Update README.md

updated tested models for `conversation_basic`

---
 examples/conversation_basic/README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/examples/conversation_basic/README.md b/examples/conversation_basic/README.md
index e5ee3e320..eb89720a0 100644
--- a/examples/conversation_basic/README.md
+++ b/examples/conversation_basic/README.md
@@ -12,8 +12,9 @@ python conversation.py
 ## Tested Models
 These models are tested in this example. For other models, some modifications may be needed.
-- dashscope_chat(qwen-max)
+- dashscope_chat (qwen-max)
 - ollama_chat (ollama_llama3_8b)
+- gemini_chat (models/gemini-pro)
 ## Prerequisites
 To set up model serving with open-source LLMs, follow the guidance in

From c0f87f35a78865a17573ac0e2ab4edbaec36c414 Mon Sep 17 00:00:00 2001
From: zyzhang1130 <36942574+zyzhang1130@users.noreply.github.com>
Date: Fri, 10 May 2024 11:16:47 +0800
Subject: [PATCH 4/5] Create README.md

created README and tested models for `conversation_self_organizing` example

---
 .../conversation_self_organizing/README.md | 29 +++++++++++++++++++
 1 file changed, 29 insertions(+)
 create mode 100644 examples/conversation_self_organizing/README.md

diff --git a/examples/conversation_self_organizing/README.md b/examples/conversation_self_organizing/README.md
new file mode 100644
index 000000000..0c1fd6d45
--- /dev/null
+++ b/examples/conversation_self_organizing/README.md
@@ -0,0 +1,29 @@
+# Self-Organizing Conversation Example
+
+This example will show
+- How to set up a self-organizing conversation using the `DialogAgent` and `agent_builder`
+- How to extract the discussion scenario and participant agents from the `agent_builder`'s response
+- How to conduct a multi-round discussion among the participant agents
+
+
+## Background
+
+In this example, we demonstrate how to create a self-organizing conversation where the `agent_builder` automatically sets up the agents participating in the discussion based on a given question. The `agent_builder` provides the discussion scenario and the characteristics of the participant agents. The participant agents then engage in a multi-round discussion to solve the given question.
+
+
+## Tested Models
+
+These models are tested in this example. For other models, some modifications may be needed.
+- `dashscope_chat` with `qwen-turbo`
+- `ollama_chat` with `llama3_8b`
+- `gemini_chat` with `models/gemini-1.0-pro-latest`
+
+
+## Prerequisites
+
+To run this example, meet the following requirements:
+- Set up the `model_configs` with the appropriate API keys and endpoints
+- Provide the path to the `agent_builder_instruct.txt` file in the `load_txt` function
+- Set the desired `max_round` for the discussion
+- Provide the `query` or question for the discussion
+- [Optional] Adjust the `generate_args` such as `temperature` for the `openai_chat` model
\ No newline at end of file

From b84d618d98d5950e32c2932dda6524acca2be5dc Mon Sep 17 00:00:00 2001
From: zyzhang1130 <36942574+zyzhang1130@users.noreply.github.com>
Date: Fri, 10 May 2024 15:50:31 +0800
Subject: [PATCH 5/5] Update conversation_with_mentions/README.md

---
 examples/conversation_with_mentions/README.md | 6 +-----
 1 file changed, 1 insertion(+), 5 deletions(-)

diff --git a/examples/conversation_with_mentions/README.md b/examples/conversation_with_mentions/README.md
index 2cb12ea52..858915710 100644
--- a/examples/conversation_with_mentions/README.md
+++ b/examples/conversation_with_mentions/README.md
@@ -33,8 +33,4 @@ You can adjust the behavior and parameters of the NPC agents and conversation mo
 ### Changing User Input Time Limit
 Adjust the `USER_TIME_TO_SPEAK` variable in the `main.py` script to change the time limit for user input.
-
-### Setting a Default Topic for the Chat Room
-
-Modify the `DEFAULT_TOPIC` variable in the `main.py` script to set the initial topic of the chat room.
-###
\ No newline at end of file
+###
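+
+For readers curious how the "@" mention routing could be implemented, the sketch below is a simplified, hypothetical illustration; the helper name and the regular expression are assumptions, not code taken from `main.py`:
+
+```python
+import re
+
+def mentioned_agents(message: str, agent_names: list[str]) -> list[str]:
+    """Return the names from agent_names that are @-mentioned in message."""
+    hits = re.findall(r"@(\w+)", message)  # e.g. "@AgentB hello" -> ["AgentB"]
+    return [name for name in agent_names if name in hits]
+
+# mentioned_agents("@AgentB, what do you think?", ["AgentA", "AgentB"]) -> ["AgentB"]
+```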