Welcome to the developer guide for TigerGraph CoPilot. This guide provides information on how to add a new LangChain tool, embedding service, or LLM generation service to CoPilot.
- Adding a New LangChain Tool
- Adding a New Embedding Service
- Adding a New LLM Generation Service
- Adding New Tests
If you want your agent to connect to other data sources or be able to perform custom logic, you can add a new LangChain tool to TigerGraph CoPilot. To add a new LangChain tool, follow these steps:
- In the `app/tools` directory, create a new file for your tool. The file should be named `toolname.py`, where `toolname` is the name of your tool.
- Define your tool. The tool should be a valid Python class that inherits from the LangChain `BaseTool` class. For more information, refer to the LangChain documentation.
- Add your tool to the `app/tools/__init__.py` file. This file should contain an import statement for your tool. For example:

```py
from .generate_function import GenerateFunction
```
- Enable your tool to be used by the agent. To do this, import and instantiate your tool in the `app/agent.py` file. For example:

```py
from tools import GenerateFunction

generate_function = GenerateFunction()
```

Then add the tool to the `tools` list in the `Agent` class. For example:

```py
tools = [mq2s, gen_func, new_tool]
```
- Test your tool. Run the service and test the tool to ensure that it works as expected.
- (Optional) Think that your tool could be useful for others? Consider contributing it! To contribute your tool, submit a pull request to the TigerGraph CoPilot repository and check out our contributing guidelines.
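The first two steps above can be sketched as follows. This is an illustrative, dependency-free sketch: `BaseTool` is stubbed out so the example runs without LangChain installed (in CoPilot you would inherit from LangChain's real `BaseTool`), and the `WordCount` tool, its name, and its logic are hypothetical, not part of CoPilot.

```python
# Illustrative sketch only: BaseTool below is a stand-in for LangChain's
# BaseTool so this file runs standalone.
class BaseTool:
    name: str = ""
    description: str = ""

    def run(self, *args, **kwargs):
        # LangChain dispatches tool calls to the subclass's _run method.
        return self._run(*args, **kwargs)


# Hypothetical tool, i.e. what app/tools/word_count.py might contain.
class WordCount(BaseTool):
    name = "word_count"
    description = "Counts the words in a piece of text."

    def _run(self, text: str) -> int:
        return len(text.split())


tool = WordCount()
print(tool.run("TigerGraph CoPilot is helpful"))  # prints 4
```

The agent selects tools by their `name` and `description`, so make both specific enough for the LLM to decide when the tool applies.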
You might want to add a new embedding service to TigerGraph CoPilot to better fit your deployment environment. To do this, follow these steps:
- In `app/embeddings/embedding_service.py`, create a new class that inherits from the `BaseEmbeddingService` class. For example:

```py
class MyEmbeddingService(BaseEmbeddingService):
    def __init__(self, config):
        super().__init__(config)
        # Add your custom initialization code here
```
- Implement the needed methods for your service. If you utilize a LangChain-supported embedding service, you can use the `BaseEmbeddingService` class as a reference. If you are using a custom endpoint, you will need to implement the `embed_documents` and `embed_query` methods accordingly.
- Import your service and add it to the `app/main.py` file where the `EmbeddingService` class is instantiated. For example:

```py
from common.embeddings.embedding_service import MyEmbeddingService

if llm_config["embedding_service"]["embedding_model_service"].lower() == "myembeddingservice":
    embedding_service = MyEmbeddingService(llm_config["embedding_service"])
```
- Test your service. Run the service and ensure that it works as expected.
- (Optional) Think that your service could be useful for others? Consider contributing it! To contribute your service, submit a pull request to the TigerGraph CoPilot repository and check out our contributing guidelines.
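To make the `embed_documents`/`embed_query` contract concrete, here is a standalone sketch. `BaseEmbeddingService` is stubbed so the example runs by itself (in CoPilot you inherit from the real class), the `dimension` config key is hypothetical, and the hash-based "embedding" is a placeholder for a real model call.

```python
import hashlib
from typing import List


class BaseEmbeddingService:
    """Stand-in for CoPilot's BaseEmbeddingService so this sketch runs alone."""

    def __init__(self, config: dict):
        self.config = config


class MyEmbeddingService(BaseEmbeddingService):
    def __init__(self, config: dict):
        super().__init__(config)
        self.dim = config.get("dimension", 8)  # hypothetical config key

    def embed_query(self, text: str) -> List[float]:
        # Placeholder: derive a deterministic vector from a hash of the text.
        # A real service would call its embedding model or endpoint here.
        digest = hashlib.sha256(text.encode("utf-8")).digest()
        return [b / 255.0 for b in digest[: self.dim]]

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        return [self.embed_query(t) for t in texts]


service = MyEmbeddingService({"dimension": 4})
vectors = service.embed_documents(["hello", "graph"])
print(len(vectors), len(vectors[0]))  # prints: 2 4
```

Whatever model you wrap, keep the output dimension fixed across calls: the embedding store expects every vector in a collection to have the same length.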
To add a new LLM generation service to TigerGraph CoPilot, follow these steps:
- Create a new file in the `app/llm_services` directory. The file should be named `service_name.py`, where `service_name` is the name of your service.
- Define your service. The service should be a valid Python class that inherits from the `LLM_Model` class defined in the `app/llm_services/base_llm.py` file.
- Add your service to the `app/llm_services/__init__.py` file. This file should contain an import statement for your service. For example:

```py
from .service_name import ServiceName
```
- Import and instantiate your service in the `app/main.py` file. For example:

```py
from common.llm_services import ServiceName

# Within the elif block where the Agent class is instantiated
elif llm_config["completion_service"]["llm_service"].lower() == "my_service":
    logger.debug(f"/{graphname}/query request_id={req_id_cv.get()} llm_service=my_service agent created")
    agent = TigerGraphAgent(ServiceName(llm_config["completion_service"]), conn, embedding_service, embedding_store)
```
- Test your service. Run the service and ensure that it works as expected.
- (Optional) Think that your service could be useful for others? Consider contributing it! To contribute your service, submit a pull request to the TigerGraph CoPilot repository and check out our contributing guidelines.
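As with the embedding service, the shape of the class can be sketched standalone. `LLM_Model` is stubbed here so the example runs by itself (in CoPilot, inherit from the real class in `app/llm_services/base_llm.py` and match its actual interface); the method name `generate`, the `llm_model` config key, and the canned response are all illustrative stand-ins for a real provider API call.

```python
class LLM_Model:
    """Stand-in for CoPilot's LLM_Model base class so this sketch runs alone."""

    def __init__(self, config: dict):
        self.config = config


# Hypothetical service, i.e. what app/llm_services/service_name.py might contain.
class ServiceName(LLM_Model):
    def __init__(self, config: dict):
        super().__init__(config)
        self.model_name = config.get("llm_model", "my-model")  # hypothetical key

    def generate(self, prompt: str) -> str:
        # Placeholder for the real completion call to your provider's API.
        return f"[{self.model_name}] response to: {prompt}"


llm = ServiceName({"llm_model": "example-1"})
print(llm.generate("How many accounts are there?"))
```

Keeping the constructor signature as `__init__(self, config)` matters: `app/main.py` passes `llm_config["completion_service"]` straight through, so your class should pull credentials, model names, and endpoints from that dict.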
To add a new InquiryAI test suite to TigerGraph CoPilot, follow these steps:
- Download the InquiryAI test template from here in `.tsv` format.
- Create a new directory in the `tests/test_questions` directory. The directory should be named `suite_name`, where `suite_name` is the name of your test suite.
- Add the `.tsv` file to the new directory, populated with your example questions and expected answers.
- (Optional) Add the necessary GSQL and setup scripts to the `tests/test_questions/suite_name` directory to support your test suite. The setup scripts are not run with the test suite, but they help set up the graph for it; the tests assume that the graph is already set up.
- Add the necessary query descriptors to the `tests/test_questions/suite_name` directory. Within a directory named after the query, add a `.json` file with the query descriptor. Optionally, add a `.gsql` file with the query itself.
- Add the test suite to the `tests/test_questions/parse_test_config.py` file by adding an available schema to the `schema` argument list.
- Test your test suite. Run the test suite and ensure that it works as expected, using the following command (and add desired options described here):

```sh
./run_tests.sh
```
- (Optional) Think that your test suite could be useful for others? Consider contributing it! To contribute your test suite, submit a pull request to the TigerGraph CoPilot repository and check out our contributing guidelines.
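The downloaded template defines the exact columns your suite file must use. Purely as an illustration of the tab-separated shape (the column names and row below are hypothetical, not taken from the template), such a file can be inspected with Python's standard library:

```python
import csv
import io

# Hypothetical rows; the real template defines the actual column names.
sample = "question\texpected_answer\nHow many transactions are in the graph?\t1500\n"

rows = list(csv.DictReader(io.StringIO(sample), delimiter="\t"))
print(rows[0]["question"])  # prints the first question
```

Because the format is tab-separated, avoid literal tab characters inside questions or answers; they would shift subsequent fields into the wrong columns.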