langchain agent -> Exiting LLM run with output: { "generations": [ [ null ] ] } #6802

Open
bnn16 opened this issue Sep 15, 2024 · 9 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

bnn16 commented Sep 15, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

export const GPTmodel = (model, temp) => {
    return new ChatOpenAI({
        model,
        temperature: temp,
        configuration: {
            baseURL: LLM_GATEWAY_BASE_URL,
            dangerouslyAllowBrowser: true,
            defaultHeaders: {
                // blabla
            },
        },
        azureOpenAIBasePath: `ourpath`,
        azureOpenAIApiDeploymentName: model,
        azureOpenAIApiVersion: '2024-02-01',
        azureOpenAIApiKey: 'fake-key-we-use-auth',
    });
};


export const ProcessMiningTutor = async () => {
    const llm = GPTmodel(GPTModel.GPT4, 0);

    const promptTemplate = SystemMessagePromptTemplate.fromTemplate(`
    answer the user question in a funny and rap like way.
    The user question is: {input}
    also what is the value of foo?
    `);

    const prompt = ChatPromptTemplate.fromMessages([
        [ 'system', 'You are a helpful assistant.' ],
        promptTemplate,
        new MessagesPlaceholder('agent_scratchpad'),
    ]);
    const tools = [
        new DynamicTool({
            name: 'FOO',
            description: 'call this to get the value of foo.',
            // eslint-disable-next-line @typescript-eslint/require-await
            func: async () => 'baz',
        }),
    ];

    const agent = await createOpenAIFunctionsAgent({
        llm,
        prompt,
        tools,
    });

    return new AgentExecutor({
        agent,
        tools,
        verbose: true,
    }).withConfig({ runName: 'Agent' });
};



Then, when I call it:


 const {
        isLoading, error,
    } = useQuery(
        {
            // eslint-disable-next-line @stylistic/max-len
            queryKey: [ 'agentResponse', currentQuestion?.id, currentQuestion, currentQuestion?.agent, currentQuestion?.question, askedQuestions ],
            queryFn: async () => {
                if (!currentQuestion) {
                    return;
                }
                console.log(currentQuestion);
                // I have multiple agents; assume this resolves to the agent defined above
                const agentName = currentQuestion.agent;
                const agentCreator = agentRegistry[agentName];
                const executor = (await agentCreator()) as AgentExecutor;

               // this thing ALWAYS fails. 
                const result = executor.streamEvents({ input: currentQuestion.question }, { version: 'v1' });

                for await (const event of result) {
                    const eventType = event.event;
                    if (eventType === 'on_chain_start') {
                        // Was assigned when creating the agent with `.withConfig({"runName": "Agent"})` above
                        if (event.name === 'Agent') {
                            console.log('\n-----');
                            console.log(
                                `Starting agent: ${event.name} with input: ${JSON.stringify(
                                    event.data.input,
                                )}`,
                            );
                        }
                    } else if (eventType === 'on_chain_end') {
                        // Was assigned when creating the agent with `.withConfig({"runName": "Agent"})` above
                        if (event.name === 'Agent') {
                            console.log('\n-----');
                            console.log(`Finished agent: ${event.name}\n`);
                            console.log(`Agent output was: ${event.data.output}`);
                            console.log('\n-----');
                        }
                    } else if (eventType === 'on_llm_stream') {
                        const content = event.data?.chunk?.message?.content;
                        // Empty content in the context of OpenAI means
                        // that the model is asking for a tool to be invoked via function call.
                        // So we only print non-empty content
                        if (content !== undefined && content !== '') {
                            console.log(`| ${content}`);
                        }
                    } else if (eventType === 'on_tool_start') {
                        console.log('\n-----');
                        console.log(
                            `Starting tool: ${event.name} with inputs: ${event.data.input}`,
                        );
                    } else if (eventType === 'on_tool_end') {
                        console.log('\n-----');
                        console.log(`Finished tool: ${event.name}\n`);
                        console.log(`Tool output was: ${event.data.output}`);
                        console.log('\n-----');
                    }
                }
            },
        },
    );

Error Message and Stack Trace (if applicable)

@tanstack_react-query.js?v=77745d11:2942 Uncaught TypeError: Cannot read properties of undefined (reading 'content')
    at OpenAIFunctionsAgentOutputParser._baseMessageToString (chunk-NSISG4FD.js?v=77745d11:32:27)
    at OpenAIFunctionsAgentOutputParser._callWithConfig (chunk-NSISG4FD.js?v=77745d11:54:22)
    at OpenAIFunctionsAgentOutputParser._callWithConfig (chunk-2YXJZ3EI.js?v=77745d11:12447:28)
    at async OpenAIFunctionsAgentOutputParser._streamIterator (chunk-2YXJZ3EI.js?v=77745d11:12392:5)
    at async OpenAIFunctionsAgentOutputParser.transform (chunk-2YXJZ3EI.js?v=77745d11:12612:5)
    at async _RunnableSequence._streamIterator (chunk-2YXJZ3EI.js?v=77745d11:13336:24)

Then, after using verbose: true on the agent:

 Entering LLM run with input: {
  "messages": [
    [
      {
        "lc": 1,
        "type": "constructor",
        "id": [
          "langchain_core",
          "messages",
          "SystemMessage"
        ],
        "kwargs": {
          "content": "You are a helpful assistant.",
          "additional_kwargs": {},
          "response_metadata": {}
        }
      },
      {
        "lc": 1,
        "type": "constructor",
        "id": [
          "langchain_core",
          "messages",
          "SystemMessage"
        ],
        "kwargs": {
          "content": "\n    answer the user question in a funny and rap like way.
    The user question is: what the dawg doing?
    also what is the value of foo?  ",
          "additional_kwargs": {},
          "response_metadata": {}
        }
      }
    ]
  ]
}
chunk-2YXJZ3EI.js?v=77745d11:5262 [llm/end] [1:chain:Agent > 2:chain:OpenAIFunctionsAgent > 7:llm:ChatOpenAI] [1.28s] Exiting LLM run with output: {
  "generations": [
    [
      null
    ]
  ]
}
chunk-2YXJZ3EI.js?v=77745d11:5225 [chain/start] [1:chain:Agent > 2:chain:OpenAIFunctionsAgent > 8:parser:OpenAIFunctionsAgentOutputParser] Entering Chain run with input: {}
chunk-2YXJZ3EI.js?v=77745d11:5243 [chain/error] [1:chain:Agent > 2:chain:OpenAIFunctionsAgent > 8:parser:OpenAIFunctionsAgentOutputParser] [0ms] Chain run errored with error: "Cannot read properties of undefined (reading 'content')\n\nTypeError: Cannot read properties of undefined (reading 'content')\n    at OpenAIFunctionsAgentOutputParser._baseMessageToString (http://localhost:9000/node_modules/.vite/deps/chunk-NSISG4FD.js?v=77745d11:32:27)\n    at OpenAIFunctionsAgentOutputParser._callWithConfig (http://localhost:9000/node_modules/.vite/deps/chunk-NSISG4FD.js?v=77745d11:54:22)\n    at OpenAIFunctionsAgentOutputParser._callWithConfig (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:12447:28)\n    at async OpenAIFunctionsAgentOutputParser._streamIterator (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:12392:5)\n    at async OpenAIFunctionsAgentOutputParser.transform (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:12612:5)\n    at async _RunnableSequence._streamIterator (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:13336:24)"

Description

I'm trying to use langchain to create an agent with openai functions/tools.
My application is living fully in the front-end. React.

In my network tab I can see the response of the LLM is SUCCESSFUL (I CAN SEE THE CONTENT), but the parser fails.

But I CAN'T get LangChain agents to work. Since I'm using it on the front-end and the library is made for the back-end, I suppose there is some kind of timeout going on under the hood that doesn't await the completion of an agent/tool and just throws an error (inputVariables always becomes undefined :trollface: ).

(After some more research I found someone with the same issue, but in Angular. It seems like a known problem for LangChain when running on the front-end: https://stackoverflow.com/questions/77075029/langchain-simple-llm-predict-in-angular-returns-empty-result )

BUT if I do it in a Node.js environment, it works as expected.
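
For reference, a minimal Node.js invocation of the same executor looks like this sketch (it reuses the ProcessMiningTutor factory from the example code above):

const executor = await ProcessMiningTutor();

// AgentExecutor.invoke resolves with an object whose `output` field
// holds the agent's final answer.
const result = await executor.invoke({ input: 'what the dawg doing?' });
console.log(result.output);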

But I can see in the network tab that the response is there -_-

[Screenshot 2024-09-15 at 07 26 12]

P.S. I know I made both this and a post in Discussions; sorry for the spam. I'm on a tight deadline and need to get this working as soon as possible.

System Info

"@langchain/core": "^0.2.31",
"@langchain/openai": "^0.2.10",
"langchain": "^0.3.0",

platform macos
node version -> Node.js v20.12.2.
npm version 10.8.2


dosubot bot commented Sep 15, 2024

I found a similar issue that might be relevant to your problem: "Getting started code for chatModel throws error", which was resolved by updating zone.js to version 0.14.3 [1].

To address the TypeError: Cannot read properties of undefined (reading 'content') in your front-end React application, ensure that the prompt object returned by the pull function is not undefined. This error typically occurs when the prompt is not correctly fetched or is missing.

Here is a modified version of your code with added checks to handle the undefined case:

import { OpenAI } from "@langchain/openai";
import { TavilySearchResults } from "@langchain/community/tools/tavily_search";
import type { PromptTemplate } from "@langchain/core/prompts";

import { pull } from "langchain/hub";
import { AgentExecutor, createOpenAIFunctionsAgent } from "langchain/agents";

// Define the tools the agent will have access to.
const tools = [new TavilySearchResults({ maxResults: 1 })];

const llm = new OpenAI({
  model: "gpt-3.5-turbo-instruct",
  temperature: 0,
});

async function initializeAgent() {
  // Get the prompt to use - you can modify this!
  const prompt = await pull<PromptTemplate>("hwchase17/react");

  if (!prompt) {
    throw new Error("Failed to fetch the prompt template.");
  }

  const agent = await createOpenAIFunctionsAgent({
    llm,
    prompt,
    tools,
  });

  const agentExecutor = new AgentExecutor({
    agent,
    tools,
  });

  const result = await agentExecutor.invoke({
    input: "what is LangChain?",
  });

  console.log(result);

  const promptWithChat = await pull<PromptTemplate>("hwchase17/react-chat");

  if (!promptWithChat) {
    throw new Error("Failed to fetch the chat prompt template.");
  }

  const agentWithChat = await createOpenAIFunctionsAgent({
    llm,
    tools,
    prompt: promptWithChat,
  });

  const agentExecutorWithChat = new AgentExecutor({
    agent: agentWithChat,
    tools,
  });

  const result2 = await agentExecutorWithChat.invoke({
    input: "what's my name?",
    chat_history: "Human: Hi! My name is Cob\nAI: Hello Cob! Nice to meet you",
  });

  console.log(result2);
}

initializeAgent().catch(console.error);

This code includes checks to ensure that the prompt and promptWithChat are not undefined before proceeding. If they are undefined, an error is thrown, which can help in debugging the issue. Additionally, ensure that the agent type is correctly specified when initializing the agent executor, as shown in the LangChain.js documentation [2][3].

Furthermore, if you are using OpenAPI functions, ensure that the OpenAPI specification is correctly parsed and converted to OpenAI functions. The convertOpenAPISpecToOpenAIFunctions function in the LangChain.js library handles this conversion. Make sure that the OpenAPI spec is correctly formatted and accessible. If the content property is missing or incorrectly defined in the OpenAPI spec, it could lead to the TypeError [4].

To continue talking to Dosu, mention @dosu.

dosubot added the auto:bug label on Sep 15, 2024

bnn16 commented Sep 15, 2024

but my issue is different @dosu
I get:

Uncaught TypeError: Cannot read properties of undefined (reading 'content')
    at OpenAIFunctionsAgentOutputParser._baseMessageToString (chunk-NSISG4FD.js?v=77745d11:32:27)
    at OpenAIFunctionsAgentOutputParser._callWithConfig (chunk-NSISG4FD.js?v=77745d11:54:22)
    at OpenAIFunctionsAgentOutputParser._callWithConfig (chunk-2YXJZ3EI.js?v=77745d11:12447:28)
    at async OpenAIFunctionsAgentOutputParser._streamIterator (chunk-2YXJZ3EI.js?v=77745d11:12392:5)
    at async OpenAIFunctionsAgentOutputParser.transform (chunk-2YXJZ3EI.js?v=77745d11:12612:5)
    at async _RunnableSequence._streamIterator (chunk-2YXJZ3EI.js?v=77745d11:13336:24)

I don't even get a response from LangChain... (but I can see in my network tab that the AI responded)
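
A quick way to isolate whether the failure is in the model call or in the agent's output parser is to invoke the model directly, bypassing the agent. This is only a sketch; it reuses the GPTmodel factory and the GPTModel.GPT4 constant from the example code above:

const llm = GPTmodel(GPTModel.GPT4, 0);

// If this logs the assistant's text, the raw chat-completion path works
// and the failure is downstream, in OpenAIFunctionsAgentOutputParser.
const response = await llm.invoke('Say hi in one word.');
console.log(response.content);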


dosubot bot commented Sep 15, 2024

To resolve the Uncaught TypeError: Cannot read properties of undefined (reading 'content') error in OpenAIFunctionsAgentOutputParser, ensure that the message object being passed to the parseAIMessage method has a defined content property. This error occurs because the content property is being accessed on an undefined message object.

Here is the relevant part of the code that needs to be checked:

parseAIMessage(message: BaseMessage): ToolsAgentAction[] | AgentFinish {
  if (message.content && typeof message.content !== "string") {
    throw new Error("This agent cannot parse non-string model responses.");
  }
  if (message.additional_kwargs.tool_calls) {
    const toolCalls: OpenAIClient.Chat.ChatCompletionMessageToolCall[] =
      message.additional_kwargs.tool_calls;
    try {
      return toolCalls.map((toolCall, i) => {
        const toolInput = toolCall.function.arguments
          ? JSON.parse(toolCall.function.arguments)
          : {};
        const messageLog = i === 0 ? [message] : [];
        return {
          tool: toolCall.function.name as string,
          toolInput,
          toolCallId: toolCall.id,
          log: `Invoking "${toolCall.function.name}" with ${
            toolCall.function.arguments ?? "{}"
          }\n${message.content}`,
          messageLog,
        };
      });
    } catch (error) {
      throw new OutputParserException(
        `Failed to parse tool arguments from chat model response. Text: "${JSON.stringify(
          toolCalls
        )}". ${error}`
      );
    }
  } else {
    return {
      returnValues: { output: message.content },
      log: message.content,
    };
  }
}

Ensure that the message object passed to parseAIMessage is properly defined and has a content property. You can add a check to handle cases where message might be undefined:

parseAIMessage(message: BaseMessage): ToolsAgentAction[] | AgentFinish {
  if (!message || !message.content) {
    throw new Error("Message or message content is undefined.");
  }
  if (typeof message.content !== "string") {
    throw new Error("This agent cannot parse non-string model responses.");
  }
  if (message.additional_kwargs.tool_calls) {
    const toolCalls: OpenAIClient.Chat.ChatCompletionMessageToolCall[] =
      message.additional_kwargs.tool_calls;
    try {
      return toolCalls.map((toolCall, i) => {
        const toolInput = toolCall.function.arguments
          ? JSON.parse(toolCall.function.arguments)
          : {};
        const messageLog = i === 0 ? [message] : [];
        return {
          tool: toolCall.function.name as string,
          toolInput,
          toolCallId: toolCall.id,
          log: `Invoking "${toolCall.function.name}" with ${
            toolCall.function.arguments ?? "{}"
          }\n${message.content}`,
          messageLog,
        };
      });
    } catch (error) {
      throw new OutputParserException(
        `Failed to parse tool arguments from chat model response. Text: "${JSON.stringify(
          toolCalls
        )}". ${error}`
      );
    }
  } else {
    return {
      returnValues: { output: message.content },
      log: message.content,
    };
  }
}

This additional check ensures that the message object and its content property are defined before attempting to access them, thus preventing the Uncaught TypeError [1][2].

To continue talking to Dosu, mention @dosu.


bnn16 commented Sep 16, 2024

I managed to narrow it down to the issue being with Vite. I tried building the exact same thing as https://js.langchain.com/v0.1/docs/modules/agents/how_to/custom_agent/

in a plain create-react-app and it worked. But create-react-app doesn't use Vite. In my repo, regardless of the config, it won't work. @dosu
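
One thing worth trying here is the vite-plugin-node-polyfills package, which shims Node built-ins for Vite builds. A minimal sketch, assuming that package is installed (npm install --save-dev vite-plugin-node-polyfills):

// vite.config.ts
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';
import { nodePolyfills } from 'vite-plugin-node-polyfills';

export default defineConfig({
    plugins: [
        react(),
        // Shims Node globals and core modules (Buffer, process, stream, ...)
        // in the browser bundle.
        nodePolyfills(),
    ],
});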


dosubot bot commented Sep 16, 2024

The issue you're encountering with Vite might be related to how Vite handles module resolution and asynchronous operations compared to Create React App. Here are some steps and configurations to ensure compatibility with LangChain.js in a Vite project:

  1. Dependencies:
    Ensure you have the necessary dependencies in your package.json:

    {
      "dependencies": {
        "@langchain/anthropic": "workspace:*",
        "@langchain/community": "workspace:*",
        "@langchain/core": "workspace:*",
        "@langchain/openai": "workspace:*",
        "langchain": "workspace:*",
        "typescript": "^5.0.0",
        "vite": "^4.2.0"
      }
    }
  2. Scripts:
    Add the following scripts to your package.json:

    {
      "scripts": {
        "dev": "vite",
        "build": "vite build",
        "preview": "vite preview",
        "test": "tsc"
      }
    }
  3. Vite Configuration:
    Ensure your Vite configuration (vite.config.ts or vite.config.js) is set up correctly. Here is a basic example:

    import { defineConfig } from 'vite';
    import react from '@vitejs/plugin-react';
    
    export default defineConfig({
      plugins: [react()],
      resolve: {
        alias: {
          '@': '/src',
        },
      },
    });
  4. Handling Asynchronous Operations:
    LangChain.js uses the AsyncCaller class to handle asynchronous operations and timeouts. Ensure that your code properly handles these operations. Here is an example of how to use AsyncCaller:

    import { AsyncCaller } from "@langchain/core/utils/async_caller";
    
    const caller = new AsyncCaller({});
    const callable = () =>
      fetch("https://langchain.com/?sleep=1000", {
        signal: AbortSignal.timeout(10),
      });
    
    await expect(() => caller.call(callable)).rejects.toThrowError(
      "TimeoutError: The operation was aborted due to timeout"
    );
  5. Module Resolution:
    LangChain.js handles module resolution using the resolve and dirname functions from the node:path module and the fileURLToPath function from the node:url module. Ensure your Vite project is configured to handle these correctly.

    import { resolve, dirname } from "node:path";
    import { fileURLToPath } from "node:url";
    
    function abs(relativePath: string): string {
      return resolve(dirname(fileURLToPath(import.meta.url)), relativePath);
    }
    
    export const config = {
      internals: [/node\:/, /@langchain\/core\//],
      entrypoints: {
        index: "index",
      },
      tsConfigPath: resolve("./tsconfig.json"),
      cjsSource: "./dist-cjs",
      cjsDestination: "./dist",
      abs,
    };

By ensuring these configurations and handling asynchronous operations correctly, you should be able to resolve the issues with Vite and LangChain.js compatibility [1][2].

To continue talking to Dosu, mention @dosu.


bnn16 commented Sep 16, 2024

So what I'm getting is that I need to add a polyfill to my Vite config in order for it to work? @dosu


dosubot bot commented Sep 16, 2024

Yes, adding polyfills to your Vite configuration can help resolve compatibility issues. Here’s how you can add polyfills for Node.js modules in your Vite project:

  1. Install Polyfills:
    First, install the necessary polyfills:

    npm install --save-dev node-polyfill-webpack-plugin
  2. Update Vite Configuration:
    Modify your vite.config.ts or vite.config.js to include the polyfills:

    import { defineConfig } from 'vite';
    import react from '@vitejs/plugin-react';
    import NodePolyfillPlugin from 'node-polyfill-webpack-plugin';
    
    export default defineConfig({
      plugins: [react()],
      resolve: {
        alias: {
          '@': '/src',
        },
      },
      build: {
        rollupOptions: {
          plugins: [
            NodePolyfillPlugin({
              excludeAliases: ['console']
            })
          ]
        }
      }
    });
  3. Add Polyfills in Your Code:
    Ensure you import the necessary polyfills in your entry file (e.g., main.ts or index.ts):

    import { Buffer } from 'buffer';
    import process from 'process';
    
    window.Buffer = Buffer;
    window.process = process;

By adding these polyfills, you should be able to resolve the issues related to module resolution and asynchronous operations in your Vite project.

To continue talking to Dosu, mention @dosu.


bnn16 commented Sep 16, 2024

@dosu what about a Vite polyfill?


dosubot bot commented Sep 16, 2024

To ensure LangChain.js works correctly in a Vite environment, you need to add polyfills for certain Node.js functionalities. Here’s how you can do it:

  1. Install Necessary Polyfills:
    Install the required polyfills:

    npm install --save-dev node-polyfill-webpack-plugin web-streams-polyfill core-js
  2. Update Vite Configuration:
    Modify your vite.config.ts or vite.config.js to include the polyfills:

    import { defineConfig } from 'vite';
    import react from '@vitejs/plugin-react';
    import NodePolyfillPlugin from 'node-polyfill-webpack-plugin';
    
    export default defineConfig({
      plugins: [react()],
      resolve: {
        alias: {
          '@': '/src',
        },
      },
      build: {
        rollupOptions: {
          plugins: [
            NodePolyfillPlugin({
              excludeAliases: ['console']
            })
          ]
        }
      }
    });
  3. Add Polyfills in Your Code:
    Ensure you import the necessary polyfills in your entry file (e.g., main.ts or index.ts):

    import 'web-streams-polyfill/es6';
    import 'core-js/stable/structured-clone';
    
    if (!globalThis.fetch) {
      globalThis.fetch = require('node-fetch');
    }

These steps will help you polyfill the required Node.js functionalities (fetch, ReadableStream, and structuredClone) in your Vite project, ensuring compatibility with LangChain.js [1][2].
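
A quick sanity check (a sketch; run it at the top of the entry file, before any LangChain import) that the polyfills actually took effect:

// All three should log 'function' once the polyfills are in place.
console.log(typeof fetch, typeof ReadableStream, typeof structuredClone);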

To continue talking to Dosu, mention @dosu.
