Add an example for the compatibility with langchain (#62)
DavdGao authored Mar 18, 2024
1 parent 58d5100 commit f8301a5
Showing 2 changed files with 112 additions and 0 deletions.
examples/conversation_with_langchain/READMD.md (27 additions, 0 deletions)
# Create an Agent with LangChain

AgentScope is a highly flexible multi-agent platform. It allows developers
to create agents with third-party libraries.

In this example, we will show how to create an assistant agent with
LangChain in AgentScope, and how to interact with the user in a conversation.

**Note**: we use the OpenAI API for LangChain in this example. Developers can
modify it according to their own needs.
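
The agent code in this commit reads the key from the `OPENAI_API_KEY` environment variable. A minimal sketch of providing it from Python (the key value is a placeholder; exporting the variable in the shell works just as well):

```python
import os

# Placeholder key; replace it with a real OpenAI API key, or simply
# export OPENAI_API_KEY in the shell before running the example.
os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")
```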

## Install LangChain

Before running the example, please install LangChain with the following command:
```bash
pip install langchain==0.1.11 langchain-openai==0.0.8
```

## Create Agent with LangChain

In this example, memory management, prompt engineering, and model
invocation are all handled by LangChain.
Specifically, we create an agent class named `LangChainAgent`.
In its `reply` function, developers only need to parse the input message and
wrap the output message into the `agentscope.message.Msg` class.
After that, developers can build the conversation in AgentScope, and the
`LangChainAgent` behaves the same as any other agent in AgentScope.
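
As a rough sketch of that flow (illustrative only: `reply_sketch` is a hypothetical helper, not part of the example; `llm_chain` is the LangChain `LLMChain` built in the agent's `__init__`, and the complete script is the second file in this commit):

```python
from typing import Optional

from langchain.chains import LLMChain
from agentscope.message import Msg


def reply_sketch(
    agent_name: str,
    llm_chain: LLMChain,
    x: Optional[dict] = None,
) -> Msg:
    """Illustrative stand-in for LangChainAgent.reply."""
    # LangChain handles memory, prompting, and model invocation
    response_str = llm_chain.predict(human_input=x.content)
    # Wrap the raw string into an AgentScope message
    return Msg(name=agent_name, content=response_str)
```
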
Second changed file (85 additions, 0 deletions)
# -*- coding: utf-8 -*-
"""A simple example of using langchain to create an assistant agent in
AgentScope."""
import os
from typing import Optional

from langchain_openai import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

import agentscope
from agentscope.agents import AgentBase
from agentscope.agents import UserAgent
from agentscope.message import Msg


class LangChainAgent(AgentBase):
"""An agent that implemented by langchain."""

def __init__(self, name: str) -> None:
"""Initialize the agent."""

# Disable AgentScope memory and use langchain memory instead
super().__init__(name, use_memory=False)

# [START] BY LANGCHAIN
# Create a memory in langchain
memory = ConversationBufferMemory(memory_key="chat_history")

# Prepare prompt
template = """
You are a helpful assistant, and your goal is to help the user.
{chat_history}
Human: {human_input}
Assistant:"""

prompt = PromptTemplate(
input_variables=["chat_history", "human_input"],
template=template,
)

llm = OpenAI(openai_api_key=os.environ["OPENAI_API_KEY"])

# Prepare a chain and manage the memory by LLMChain in langchain
self.llm_chain = LLMChain(
llm=llm,
prompt=prompt,
verbose=False,
memory=memory,
)
# [END] BY LANGCHAIN

    def reply(self, x: Optional[dict] = None) -> Msg:
        """Generate a reply with the langchain chain and wrap it into an
        AgentScope message."""
        # [START] BY LANGCHAIN

        # Generate response
        response_str = self.llm_chain.predict(human_input=x.content)

        # [END] BY LANGCHAIN

        # Wrap the response in a message object in AgentScope
        return Msg(name=self.name, content=response_str)


# Build a conversation between user and assistant agent

# init AgentScope
agentscope.init()

# Create an instance of the langchain agent
agent = LangChainAgent(name="Assistant")

# Create a user agent from AgentScope
user = UserAgent("User")

msg = None
while True:
    # User input
    msg = user(msg)
    if msg.content == "exit":
        break
    # Agent speaks
    msg = agent(msg)
