Agent Tools: ValueError: <function_name> is not strict. Only strict function tools can be auto-parsed #2386

Open

jaimeescano opened this issue Nov 11, 2024 · 0 comments
Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangGraph/LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangGraph/LangChain rather than my code.
  • I am sure this is better as an issue rather than a GitHub discussion, since this is a LangGraph bug and not a design question.

Example Code

############################ Graph flow/src/graph.py
from langgraph.prebuilt import ToolNode, tools_condition
from langgraph.store.memory import InMemoryStore
from langgraph.graph.message import add_messages
from langgraph.graph import StateGraph

from langchain_core.messages import SystemMessage, HumanMessage, ToolMessage, AIMessage
from langchain_openai import AzureChatOpenAI
from langchain_core.tools import tool

from typing_extensions import Annotated, TypedDict
import random
import os

############# Graph #############
class ConfigSchema(TypedDict):
    """Define the schema for the config object."""
    thread_id: str

class StateSchema(TypedDict):
    """Define the schema for the state object."""
    prompt: str
    messages: Annotated[list, add_messages]  # TypedDict fields cannot declare default values


############# Tools #############
@tool
def get_city_population(city: str) -> int:
    """Get the population of a city"""
    return random.randint(1000, 1000000)  # dummy value, sufficient to reproduce the issue

tools = [get_city_population]
############# Nodes #############
class Agent:
    name: str = "Agent"
    agent: AzureChatOpenAI = None
    tools: list = []

    def __init__(self, tools: list = [], name: str = "Agent"):
        self.name = name

        self.agent = AzureChatOpenAI(
            azure_deployment=os.getenv('MODEL', 'gpt-4o-mini'),
            azure_endpoint=os.getenv('AZURE_OPENAI_ENDPOINT'),
            openai_api_key=os.getenv('AZURE_OPENAI_API_KEY'),
            openai_api_version=os.getenv("OPENAI_API_VERSION"),
            model_kwargs={
                "response_format": {"type": "json_object"}
            }
        )

        # Bind tools to the agent if they are provided
        if len(tools) > 0:
            self.tools = tools

        # Bind tools to the agent if they are already set
        if len(self.tools) > 0:
            self.agent = self.agent.bind_tools(self.tools)
            # self.agent = self.agent.bind_tools(self.tools, strict=True)

        # Disabled: this had no visible effect. Note that bind() returns a new
        # runnable instead of mutating in place, so its result would need to be
        # assigned back to self.agent to take effect.
        # self.agent.bind(response_format={"type": "json_object"})

    def run(self, state: dict, config: dict) -> dict:
        existing_messages = state.get('messages', [])

        new_messages = []

        if not existing_messages:
            new_messages = [
                SystemMessage("Your task is to find the population of a city requested by the user. Your response should be a JSON object with the city name and the population. {'<city>':'<population>'}"),
                HumanMessage(state.get('prompt')),
            ]

        response = self.agent.invoke(existing_messages + new_messages)

        new_messages.append(response)

        return {
            "messages": new_messages
        }


graph_builder = StateGraph(StateSchema, config_schema=ConfigSchema)

agent_node = Agent(tools=tools)
tools_node = ToolNode(tools=tools)

# Add the nodes to the graph

graph_builder.add_node(agent_node.name, agent_node.run)
graph_builder.add_node(tools_node.name, tools_node)

graph_builder.add_conditional_edges(
    agent_node.name,
    tools_condition
)
graph_builder.add_edge(tools_node.name, agent_node.name)

# Configure the graph
graph_builder.set_entry_point(agent_node.name)
graph_builder.set_finish_point(agent_node.name)

in_memory_store = InMemoryStore()
graph = graph_builder.compile(store = in_memory_store)




############################ FastAPI api.py
from fastapi import FastAPI
from pydantic import BaseModel
from flow.src.graph import graph as graph_module

app = FastAPI()

class GraphInput(BaseModel):
    prompt: str

@app.post("/graph")
async def create_graph(input: GraphInput):
    return graph_module.invoke({"prompt": input.prompt})
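
For reference, this is how the endpoint is exercised in the troubleshooting steps further below (a usage sketch; assumes uvicorn's default port):

import requests

resp = requests.post(
    "http://localhost:8000/graph",
    json={"prompt": "Population of London?"},
)
print(resp.json())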

Error Message and Stack Trace (if applicable)

ERROR:    Exception in ASGI application
 Traceback (most recent call last):
   File "/usr/local/lib/python3.11/site-packages/uvicorn-0.32.0-py3.11.egg/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
     result = await app(  # type: ignore[func-returns-value]
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/uvicorn-0.32.0-py3.11.egg/uvicorn/middleware/proxy_headers.py", line 60, in __call__
     return await self.app(scope, receive, send)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/fastapi-0.115.4-py3.11.egg/fastapi/applications.py", line 1054, in __call__
     await super().__call__(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/applications.py", line 113, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/middleware/errors.py", line 187, in __call__
     raise exc
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/middleware/errors.py", line 165, in __call__
     await self.app(scope, receive, _send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/middleware/exceptions.py", line 62, in __call__
     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/_exception_handler.py", line 53, in wrapped_app
     raise exc
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/_exception_handler.py", line 42, in wrapped_app
     await app(scope, receive, sender)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/routing.py", line 715, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/routing.py", line 735, in app
     await route.handle(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/routing.py", line 288, in handle
     await self.app(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/routing.py", line 76, in app
     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/_exception_handler.py", line 53, in wrapped_app
     raise exc
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/_exception_handler.py", line 42, in wrapped_app
     await app(scope, receive, sender)
   File "/usr/local/lib/python3.11/site-packages/starlette-0.41.2-py3.11.egg/starlette/routing.py", line 73, in app
     response = await f(request)
                ^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/fastapi-0.115.4-py3.11.egg/fastapi/routing.py", line 301, in app
     raw_response = await run_endpoint_function(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/fastapi-0.115.4-py3.11.egg/fastapi/routing.py", line 212, in run_endpoint_function
     return await dependant.call(**values)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/app/api.py", line 14, in create_graph
     return graph_module.invoke({"prompt": input.prompt})
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/langgraph-0.2.45-py3.11.egg/langgraph/pregel/__init__.py", line 1608, in invoke
     for chunk in self.stream(
   File "/usr/local/lib/python3.11/site-packages/langgraph-0.2.45-py3.11.egg/langgraph/pregel/__init__.py", line 1336, in stream
     for _ in runner.tick(
   File "/usr/local/lib/python3.11/site-packages/langgraph-0.2.45-py3.11.egg/langgraph/pregel/runner.py", line 58, in tick
     run_with_retry(t, retry_policy)
   File "/usr/local/lib/python3.11/site-packages/langgraph-0.2.45-py3.11.egg/langgraph/pregel/retry.py", line 29, in run_with_retry
     task.proc.invoke(task.input, config)
   File "/usr/local/lib/python3.11/site-packages/langgraph-0.2.45-py3.11.egg/langgraph/utils/runnable.py", line 410, in invoke
     input = context.run(step.invoke, input, config, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/langgraph-0.2.45-py3.11.egg/langgraph/utils/runnable.py", line 184, in invoke
     ret = context.run(self.func, input, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/app/flow/src/graph.py", line 71, in run
     response = self.agent.invoke(messages)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/langchain_core-0.3.15-py3.11.egg/langchain_core/runnables/base.py", line 5354, in invoke
     return self.bound.invoke(
            ^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/langchain_core-0.3.15-py3.11.egg/langchain_core/language_models/chat_models.py", line 286, in invoke
     self.generate_prompt(
   File "/usr/local/lib/python3.11/site-packages/langchain_core-0.3.15-py3.11.egg/langchain_core/language_models/chat_models.py", line 786, in generate_prompt
     return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/langchain_core-0.3.15-py3.11.egg/langchain_core/language_models/chat_models.py", line 643, in generate
     raise e
   File "/usr/local/lib/python3.11/site-packages/langchain_core-0.3.15-py3.11.egg/langchain_core/language_models/chat_models.py", line 633, in generate
     self._generate_with_cache(
   File "/usr/local/lib/python3.11/site-packages/langchain_core-0.3.15-py3.11.egg/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
     result = self._generate(
              ^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/langchain_openai-0.2.6-py3.11.egg/langchain_openai/chat_models/base.py", line 701, in _generate
     response = self.root_client.beta.chat.completions.parse(**payload)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.11/site-packages/openai-1.54.3-py3.11.egg/openai/resources/beta/chat/completions.py", line 142, in parse
     _validate_input_tools(tools)
   File "/usr/local/lib/python3.11/site-packages/openai-1.54.3-py3.11.egg/openai/lib/_parsing/_completions.py", line 53, in validate_input_tools
     raise ValueError(
 ValueError: `get_city_population` is not strict. Only `strict` function tools can be auto-parsed

Description

We are experiencing an issue when binding tools to AzureChatOpenAI. The odd behaviour is that the graph works in LangGraph Studio, but once it is triggered via FastAPI an error is thrown indicating "Only strict function tools can be auto-parsed".
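
A quick way to isolate the serving layer (a sketch, assuming the graph module above) is to invoke the compiled graph directly from a Python shell; if the problem lives in the model call rather than in FastAPI, the same ValueError should surface here too:

from flow.src.graph import graph

result = graph.invoke({"prompt": "Population of London?"})
print(result["messages"][-1].content)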

The issue seems to be related to the response_format. When declaring the LLM with AzureChatOpenAI, we pass "response_format": {"type": "json_object"} through model_kwargs. We also tried the option agent.bind(response_format={"type": "json_object"}), but it does not seem to have any effect: if we remove the word "JSON" from the system prompt, no error is raised, whereas the API should respond with "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'.". So only model_kwargs within AzureChatOpenAI actually activates the response_format.
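
As a side note on bind(): it returns a new runnable rather than mutating in place, so a bare call discards the bound kwargs. A minimal sketch with a generic runnable (RunnableLambda is only a stand-in here):

from langchain_core.runnables import RunnableLambda

r = RunnableLambda(lambda x: x)

r.bind(response_format={"type": "json_object"})            # no effect: the new runnable is discarded
bound = r.bind(response_format={"type": "json_object"})    # correct: keep the returned binding

print(type(bound).__name__)  # RunnableBinding
print(bound.kwargs)          # {'response_format': {'type': 'json_object'}}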

Troubleshooting 1:

  1. Set model_kwargs = {"response_format": {"type": "json_object"}}
  2. API call to the fastapi endpoint: POST /graph {"prompt":"Population of London?"}
  3. Get ERROR: "Only `strict` function tools can be auto-parsed"

Troubleshooting 2:

  1. Set model_kwargs = {"response_format": {"type": "json_object"}}
  2. Set agent.bind_tools(tools, strict = True)
  3. API call to the fastapi endpoint: POST /graph {"prompt":"Population of London?"}
  4. Get ERROR: langgraph.errors.GraphRecursionError: Recursion limit of 25 reached without hitting a stop condition (a diagnostic sketch for raising the limit follows below)
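
A diagnostic sketch for this recursion error (it only buys more iterations for inspection, it does not fix the loop): LangGraph accepts a per-invocation recursion_limit through the config.

from flow.src.graph import graph
from langgraph.errors import GraphRecursionError

try:
    # recursion_limit defaults to 25; raising it here is purely for debugging
    result = graph.invoke(
        {"prompt": "Population of London?"},
        config={"recursion_limit": 50},
    )
except GraphRecursionError:
    # still looping: the model presumably keeps emitting tool calls every turn
    print("agent/tools loop never reached a stop condition")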

Troubleshooting 3:

  1. Setting agent.bind(response_format={"type": "json_object"})
  2. API call to the fastapi endpoint: POST /graph {"prompt":"Population of London?"}
  3. WORKED, but it looks like response_format is not activated (see the inspection sketch after this list).
  4. Removed "JSON" keyword from the system prompt
  5. API call to the fastapi endpoint: POST /graph {"prompt":"Population of London?"}
  6. WORKED without triggering any error, so the json_object feature is not actually being used. That is not a valid option for us.
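
To verify whether response_format was actually attached in this scenario, the bound kwargs can be inspected on the runnable (a quick sketch reusing the Agent class above):

from langchain_core.runnables import RunnableBinding

agent = Agent(tools=tools).agent  # the runnable assembled in __init__

# After bind()/bind_tools() the runnable is a RunnableBinding; its kwargs hold
# everything that will be merged into each model call.
if isinstance(agent, RunnableBinding):
    print(agent.kwargs)  # 'response_format' should appear here if the bind stuck
else:
    print("no kwargs bound to this runnable")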

Troubleshooting 4:

  1. Set model_kwargs = {"response_format": {"type": "json_object"}}
  2. Removed "JSON" keyword from the system prompt
  3. API call to the fastapi endpoint: POST /graph {"prompt":"Population of London?"}
  4. Get the expected ERROR: "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'.", which confirms that response_format is activated.

And finally, running the initial version with model_kwargs = {"response_format": {"type": "json_object"}} and just
self.agent = agent.bind_tools(tools) works perfectly when the graph is triggered from LangGraph Studio.

Within the stack trace, the call goes through 'openai/resources/beta/chat/completions.py', so we are not sure whether the beta endpoint could be related.
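
For context on that path: openai's beta parse helper validates that every function tool carries strict: True and a closed JSON schema before it will auto-parse anything, which matches the ValueError above. A sketch of what a strict tool payload looks like, based on OpenAI's structured-outputs format rather than on what langchain emits here (bind_tools(tools, strict=True) from Troubleshooting 2 is LangChain's way of producing this shape):

# A strict function tool as the OpenAI structured-outputs spec defines it:
# "strict": True plus a closed schema (all properties required,
# additionalProperties disallowed).
strict_tool = {
    "type": "function",
    "function": {
        "name": "get_city_population",
        "description": "Get the population of a city",
        "strict": True,
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
            "additionalProperties": False,
        },
    },
}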

Thanks for your help and support ;-)

System Info

python -m langchain_core.sys_info

System Information

OS: Linux
OS Version: #1 SMP Mon Aug 12 08:47:01 UTC 2024
Python Version: 3.11.10 (main, Oct 19 2024, 18:56:55) [GCC 12.2.0]

Package Information

langchain_core: 0.3.15
langchain: 0.3.7
langchain_community: 0.3.5
langsmith: 0.1.142
langchain_openai: 0.2.6
langchain_postgres: 0.0.12
langchain_text_splitters: 0.3.2
langgraph: 0.2.45

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.11.0rc1
async-timeout: Installed. No version info available.
dataclasses-json: 0.6.7
httpx: 0.27.2
httpx-sse: 0.4.0
jsonpatch: 1.33
langgraph-checkpoint: 2.0.2
langgraph-sdk: 0.1.35
numpy: 1.26.4
openai: 1.54.3
orjson: 3.10.11
packaging: 24.1
pgvector: 0.2.5
psycopg: 3.2.3
psycopg-pool: 3.2.3
pydantic: 2.10.0b1
pydantic-settings: 2.6.1
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
sqlalchemy: 2.0.35
SQLAlchemy: 2.0.35
tenacity: 9.0.0
tiktoken: 0.8.0
typing-extensions: 4.12.2
