AzureOpenAI Missing Required Properties for BaseChatModel #6886
I found a similar open issue that might be relevant to your problem: Typing issue with langchain, @langchain/core, and @langchain/openai [1]. This issue discusses type mismatches and provides some potential solutions.

To resolve the TypeScript error indicating that the `AzureOpenAI` class is missing the `getLsParams` and `callPrompt` properties required by `BaseChatModel`, you can extend the class and implement the missing methods yourself. Here is an example of how you can extend the `AzureOpenAI` class:

```typescript
import { AzureOpenAI } from "@langchain/openai";
import {
  BasePromptValueInterface,
  BaseMessage,
  CallOptions,
  Callbacks,
} from "@langchain/core";
// Note: LangSmithParams also needs to be imported; it is exported from
// "@langchain/core/language_models/chat_models".

class CustomAzureOpenAI extends AzureOpenAI {
  getLsParams(options: this["ParsedCallOptions"]): LangSmithParams {
    return {
      ls_model_type: "chat",
      ls_stop: options.stop,
    };
  }

  async callPrompt(
    promptValue: BasePromptValueInterface,
    options?: string[] | CallOptions,
    callbacks?: Callbacks
  ): Promise<BaseMessage> {
    const promptMessages: BaseMessage[] = promptValue.toChatMessages();
    return this.call(promptMessages, options, callbacks);
  }
}

const llm = new CustomAzureOpenAI({
  model: "gpt-4o",
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  temperature: 0,
  // other params...
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
});
```

By extending the `AzureOpenAI` class and implementing the missing `getLsParams` and `callPrompt` methods, it should satisfy the `BaseChatModel` type expected by `createToolCallingAgent`.
Hey, I was wondering if I could take a look at this issue.
Checked other resources
Example Code
```typescript
import { AzureOpenAI } from "@langchain/openai";
import { createToolCallingAgent } from "langchain/agents";

const llm = new AzureOpenAI({
  model: "gpt-4o",
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  temperature: 0,
  // other params...
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
});
```
Error Message and Stack Trace (if applicable)
```json
[{
  "resource": "/Users/nicolasgertler/.../agent.ts",
  "owner": "typescript",
  "code": "2739",
  "severity": 8,
  "message": "Type 'AzureOpenAI' is missing the following properties from type 'BaseChatModel<BaseChatModelCallOptions, BaseMessageChunk>': getLsParams, callPrompt",
  "source": "ts",
  "startLineNumber": 741,
  "startColumn": 9,
  "endLineNumber": 741,
  "endColumn": 12,
  "relatedInformation": [
    {
      "startLineNumber": 16,
      "startColumn": 5,
      "endLineNumber": 16,
      "endColumn": 8,
      "message": "The expected type comes from property 'llm' which is declared here on type 'CreateToolCallingAgentParams'",
      "resource": "/Users/nicolasgertler/.../node_modules/langchain/dist/agents/tool_calling/index.d.ts"
    }
  ]
}]
```
Description
When using the AzureOpenAI class from the @langchain/openai package, a TypeScript error occurs indicating that the class is missing required properties (getLsParams and callPrompt) from the BaseChatModel interface. This error prevents the AzureOpenAI class from being used as an llm in the createToolCallingAgent function.
This error is a LangChain issue because the AzureOpenAI class provided by the @langchain/openai package does not fully implement the BaseChatModel interface required by the createToolCallingAgent function. The missing methods (getLsParams and callPrompt) are expected by LangChain's internal type definitions, indicating a gap in the library's implementation.
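To illustrate the kind of check behind error TS2739, independent of LangChain: TypeScript's structural typing requires every member declared on the target type to exist on the assigned value, so a class missing even one expected method is rejected, and adding the method (e.g. via a subclass) makes the assignment compile. All names in this sketch are hypothetical stand-ins, not LangChain APIs:

```typescript
// Hypothetical minimal reproduction of a TS2739-style error.
interface ChatModelLike {
  invoke(input: string): string;
  getLsParams(): { ls_model_type: string };
}

// Missing getLsParams: assigning an instance of this class to
// ChatModelLike would fail with "Type 'IncompleteModel' is missing
// the following properties ..."
class IncompleteModel {
  invoke(input: string): string {
    return `echo: ${input}`;
  }
}

// Subclassing and supplying the missing method satisfies the
// interface, mirroring the CustomAzureOpenAI workaround suggested
// in this thread.
class PatchedModel extends IncompleteModel {
  getLsParams(): { ls_model_type: string } {
    return { ls_model_type: "chat" };
  }
}

const model: ChatModelLike = new PatchedModel(); // compiles cleanly
```

The same mechanism means the error can also appear when two different versions of the package declaring the interface end up in `node_modules`: the declared members no longer line up even though the runtime code works.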
System Info
1. @langchain/community: ^0.2.16
2. @langchain/core: ^0.2.16
3. @langchain/openai: ^0.2.6
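A common cause of cross-package type mismatches like this is having more than one copy of @langchain/core resolved in `node_modules` (the situation discussed in the similar issue linked above). As an untested sketch, not a confirmed fix for this report, npm's `overrides` field can pin a single copy (`npm ls @langchain/core` shows how many copies are installed):

```json
{
  "overrides": {
    "@langchain/core": "0.2.16"
  }
}
```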