
Where to add config to streamEvents? #318

Open
guidorietbroek opened this issue Aug 15, 2024 · 6 comments
@guidorietbroek

Maybe a stupid question, but I can't find it in the docs.

If I want to add a thread_id to a final response of a graph (specific node), how to add this thread id as config?

let config = {
  configurable: {
    thread_id: "1",
  },
};

const eventStream = await graph.streamEvents(
  { messages: [["user", "What's the capital of Nepal?"]] },
  { version: "v2" },
  { includeNames: ["Summarizer"] }
);

And why do you need to use version v1 or v2?

@hinthornw
Contributor

hinthornw commented Aug 15, 2024

In the second positional argument (where version is passed).

Here's the ref doc for streamEvents. options extends RunnableConfig, which contains the configuration values you're looking for. It also contains info about the differences between the versions.

We version the API explicitly since streamEvents returns all the sub-events. Streaming is critical for any application, and we plan to make improvements to tracing in the future that might change some characteristics of those events; any change of that kind would be scoped to a new version.
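As a minimal sketch of the shape described above (no graph call, just the merged options object; the field names come from the snippets in this thread):

```typescript
// Sketch of the second positional argument to streamEvents:
// RunnableConfig fields (like configurable.thread_id) sit alongside
// version in the same object. This only builds the object; no graph
// call is made here.
const config = {
  configurable: {
    thread_id: "1",
  },
};

const options = { version: "v2", ...config };

// options === { version: "v2", configurable: { thread_id: "1" } }
```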

@justinlevi
Contributor

Perhaps somewhat related, but how would I restrict streaming so that only the last node streams to the response client?

If I have several nodes and each node invokes a chain, I'm currently seeing on_chat_model_stream events for all chains, and I only want to forward chunks from the last generation node.

@guidorietbroek
Author

Thanks @hinthornw

Can you just confirm this is how it is supposed to work?

const eventStream = await graph.streamEvents(
  { messages: [["user", "What's the capital of Nepal?"]] },
  { version: "v2", ...config },
  { includeNames: ["Summarizer"] }
);

@justinlevi see this js example for your answer: https://langchain-ai.github.io/langgraphjs/how-tos/stream-tokens/#other-graphs
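For readers landing here later, the effect of includeNames can be sketched library-free as a filter over an async event stream. The StreamEvent shape below is an assumption for illustration, not the library's exact event type:

```typescript
// Hypothetical event shape loosely mirroring what streamEvents yields
// (assumption, not the library's actual type).
type StreamEvent = { event: string; name: string; data: unknown };

// Library-free sketch of what includeNames does: pass through only the
// events emitted by runs whose name is in the allow-list.
async function* filterByName(
  events: AsyncIterable<StreamEvent>,
  includeNames: string[]
): AsyncGenerator<StreamEvent> {
  for await (const ev of events) {
    if (includeNames.includes(ev.name)) yield ev;
  }
}

// Demo with a mocked stream standing in for graph.streamEvents(...).
async function demo(): Promise<string[]> {
  const mock = (async function* (): AsyncGenerator<StreamEvent> {
    yield { event: "on_chat_model_stream", name: "Retriever", data: "r" };
    yield { event: "on_chat_model_stream", name: "Summarizer", data: "s" };
  })();
  const names: string[] = [];
  for await (const ev of filterByName(mock, ["Summarizer"])) {
    names.push(ev.name);
  }
  return names; // ["Summarizer"]
}
```

With the real graph you would pass the includeNames option shown earlier in the thread; this sketch only illustrates the filtering behavior it implies.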

@guidorietbroek
Author

So the confusion was caused by the fact that I get an error when using Claude in my LangGraph in combination with streamEvents. When I switch to GPT-4o, streaming works.

This is the error I get:

An error occurred: Error: No parseable tool calls provided to AnthropicToolsOutputParser.

I am using a Zod schema with withStructuredOutput:

const securitySchema = z.object({
  securityCheck: z
    .enum(["safe", "unsafe"])
    .describe("Response if an incoming user input is safe to handle or not."),
});

// const model = new ChatAnthropic({
//   temperature: 0,
//   apiKey: process.env.ANTHROPIC_API_KEY,
//   model: "claude-3-haiku-20240307",
// });

const model = new ChatOpenAI({
  temperature: 0,
  apiKey: process.env.OPENAI_API_KEY,
  model: "gpt-4o-2024-05-13",
});

const structuredModel = model.withStructuredOutput(securitySchema, {
  name: "securitySchema",
});

Is this a bug?

@hwchase17

Do you have a LangSmith trace to share?

@guidorietbroek
Author

guidorietbroek commented Aug 16, 2024
