Replies: 1 comment 3 replies
I was running into the same issue and just figured out a working solution, though there are probably improvements that can be made.

tRPC mutation:

```ts
.mutation(async function* ({ input }) {
  try {
    const result = await streamText({
      model: openai("gpt-4o-mini"),
      messages: convertToCoreMessages(
        input.messages as Parameters<typeof convertToCoreMessages>[0],
      ),
    });
    for await (const text of result.fullStream) {
      yield text;
    }
  } catch (error) {
    console.log(error);
  }
}),
```

Since tRPC returns an async iterator for the SSE batch events, in the route handler where you parse the tRPC response you can make the result usable by the hook again by converting it to a `ReadableStream`:

```ts
const result = await client.chat.mutate({
  messages,
});
const stream = new ReadableStream({
  async start(controller) {
    for await (const text of result) {
      if (text?.type === "finish") {
        controller.close();
        break;
      } else if (text?.type === "text-delta") {
        controller.enqueue(text.textDelta);
      }
    }
  },
});
return LangChainAdapter.toDataStreamResponse(stream);
```

Note that it is still required to have a separate route handler.
Is there any way to integrate a tRPC API with hooks like `useChat` or `useCompletion`?