
AzureOpenAI Missing Required Properties for BaseChatModel #6886

Open
5 tasks done
Nicolas705 opened this issue Sep 26, 2024 · 2 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


Nicolas705 commented Sep 26, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { AzureOpenAI } from "@langchain/openai";
import { createToolCallingAgent } from "langchain/agents";

const llm = new AzureOpenAI({
  model: "gpt-4o",
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  temperature: 0,
  // other params...
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
});

Error Message and Stack Trace (if applicable)

[{
  "resource": "/Users/nicolasgertler/.../agent.ts",
  "owner": "typescript",
  "code": "2739",
  "severity": 8,
  "message": "Type 'AzureOpenAI' is missing the following properties from type 'BaseChatModel<BaseChatModelCallOptions, BaseMessageChunk>': getLsParams, callPrompt",
  "source": "ts",
  "startLineNumber": 741,
  "startColumn": 9,
  "endLineNumber": 741,
  "endColumn": 12,
  "relatedInformation": [
    {
      "startLineNumber": 16,
      "startColumn": 5,
      "endLineNumber": 16,
      "endColumn": 8,
      "message": "The expected type comes from property 'llm' which is declared here on type 'CreateToolCallingAgentParams'",
      "resource": "/Users/nicolasgertler/.../node_modules/langchain/dist/agents/tool_calling/index.d.ts"
    }
  ]
}]

Description

When using the AzureOpenAI class from the @langchain/openai package, a TypeScript error occurs indicating that the class is missing required properties (getLsParams and callPrompt) from the BaseChatModel interface. This error prevents the AzureOpenAI class from being used as an llm in the createToolCallingAgent function.

This error is a LangChain issue because the AzureOpenAI class provided by the @langchain/openai package does not fully implement the BaseChatModel interface required by the createToolCallingAgent function. The missing methods (getLsParams and callPrompt) are expected by LangChain's internal type definitions, indicating a gap in the library's implementation.
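The mechanism behind TS2739 can be shown with a small self-contained sketch. The interface and class names below are hypothetical stand-ins, not LangChain's actual definitions: TypeScript's structural typing rejects any value whose class lacks a member declared on the expected type, which is how the error surfaces when a completion-style class is passed where a chat-model type is expected.

```typescript
// Hypothetical stand-in for an interface like BaseChatModel that
// declares a member the passed-in class must provide.
interface ChatModelLike {
  getLsParams(stop?: string[]): { ls_model_type: string; ls_stop?: string[] };
}

// A class without getLsParams. The assignment
//   const m: ChatModelLike = new CompletionModel();
// would fail at compile time with TS2739 ("missing the following
// properties"), mirroring the reported error.
class CompletionModel {}

// A subclass that adds the missing member satisfies the interface.
class PatchedModel extends CompletionModel implements ChatModelLike {
  getLsParams(stop?: string[]) {
    return { ls_model_type: "chat", ls_stop: stop };
  }
}

const model: ChatModelLike = new PatchedModel();
console.log(model.getLsParams(["\n"]).ls_model_type); // prints "chat"
```

This is why subclassing (as suggested below by the bot) can silence the compiler, though the added members must also behave correctly at runtime for the agent to work.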

System Info

@langchain/community: ^0.2.16
@langchain/core: ^0.2.16
@langchain/openai: ^0.2.6

@ccurme ccurme transferred this issue from langchain-ai/langchain Sep 26, 2024
@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Sep 26, 2024

dosubot bot commented Sep 26, 2024

I found a similar open issue that might be relevant to your problem: Typing issue with langchain, @langchain/core, and @langchain/openai [1]. This issue discusses type mismatches and provides some potential solutions.

To resolve the TypeScript error indicating that the AzureOpenAI class is missing required properties (getLsParams and callPrompt) from the BaseChatModel interface, you need to ensure that the AzureOpenAI class implements these methods.

Here is an example of how you can extend the AzureOpenAI class to include the missing methods:

import { AzureOpenAI } from "@langchain/openai";
import { BaseMessage } from "@langchain/core/messages";
import { BasePromptValueInterface } from "@langchain/core/prompt_values";
import { Callbacks } from "@langchain/core/callbacks/manager";
import { BaseLanguageModelCallOptions } from "@langchain/core/language_models/base";
import { LangSmithParams } from "@langchain/core/language_models/chat_models";

class CustomAzureOpenAI extends AzureOpenAI {
  getLsParams(options: this["ParsedCallOptions"]): LangSmithParams {
    return {
      ls_model_type: "chat",
      ls_stop: options.stop,
    };
  }

  async callPrompt(
    promptValue: BasePromptValueInterface,
    options?: string[] | BaseLanguageModelCallOptions,
    callbacks?: Callbacks
  ): Promise<BaseMessage> {
    const promptMessages: BaseMessage[] = promptValue.toChatMessages();
    return this.call(promptMessages, options, callbacks);
  }
}

const llm = new CustomAzureOpenAI({
  model: "gpt-4o",
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  temperature: 0,
  // other params...
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
});

By extending the AzureOpenAI class and adding the getLsParams and callPrompt methods, you ensure that it conforms to the BaseChatModel interface, resolving the TypeScript error [2][3].


@ElhamBadri2411

Hey, I was wondering if I could take a look at this issue.
