Checked other resources
I searched the LangChain.js documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
I am sure that this is a bug in LangChain.js rather than my code.
The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
// This is not easily reproducible because it has to run inside a VS Code extension to fail...
// I don't know enough about creating VS Code extensions to create a minimal working example,
// but this is the implementation I use to generate a completion.
// Imports used by this snippet (module names assumed from my dependencies;
// logger, model and provider are defined elsewhere in the extension):
import * as vscode from 'vscode';
import axios from 'axios';
import { HttpsProxyAgent } from 'https-proxy-agent';
import { ChatOpenAI } from '@langchain/openai';

// Retrieve proxy settings
const httpConfig = vscode.workspace.getConfiguration('http');
const proxy = httpConfig.get('proxy');
logger.info('Retrieved proxy settings:', proxy);
if (proxy) {
logger.info('Proxy detected. Configuring Axios to use the proxy.');
const agent = new HttpsProxyAgent(proxy);
axios.defaults.proxy = false; // Disable default Axios proxy behavior
axios.defaults.httpsAgent = agent;
logger.info('Axios proxy configuration complete.');
} else {
logger.info('No proxy detected. Proceeding without proxy configuration.');
}
try {
model = new ChatOpenAI({
modelName: 'gpt-4o-mini', // same issue with gpt-4o, same issue also with GroqChat
maxRetries: 1,
});
try {
const response = await model.invoke(
[
[
"system",
"You are a helpful assistant that translates English to French. Translate the user sentence.",
],
[
"human",
"I love programming.",
],
]
);
logger.info(`LLM response: ${response.content}`); // log the message content rather than the object
} catch (error) {
logger.error(`Error message: ${error.message}`);
logger.error(`Error stack trace: ${error.stack}`);
}
} catch (error) {
logger.info(`chatgpt.model: ${provider.modelManager.model} response: ${error}`);
throw error;
}
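For what it's worth, I realize the axios proxy setup above probably has no effect on the ChatOpenAI call, since @langchain/openai goes through the official openai Node SDK rather than axios. If a proxy were involved, I would expect to have to pass the agent through ChatOpenAI's configuration option instead. A rough sketch of that idea (untested in my setup; the httpAgent pass-through is an assumption based on the openai client options):

// Hypothetical variant: hand the proxy agent to the underlying openai client
// instead of axios. The `configuration` object is forwarded to the openai SDK.
const proxyAgent = proxy ? new HttpsProxyAgent(proxy) : undefined;
const proxiedModel = new ChatOpenAI({
  modelName: 'gpt-4o-mini',
  maxRetries: 1,
  configuration: proxyAgent ? { httpAgent: proxyAgent } : undefined,
});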
Error Message and Stack Trace (if applicable)
2024-09-15T09:55:11.354Z - INFO Retrieved proxy settings:
2024-09-15T09:55:11.354Z - INFO No proxy detected. Proceeding without proxy configuration.
2024-09-15T09:55:13.101Z - ERROR Error message: Connection error.
2024-09-15T09:55:13.101Z - ERROR Error stack trace: Error: Connection error.
at OpenAI.makeRequest (/home/jean/git/chatgpt-copilot/out/Extension.js:56648:13)
at async /home/jean/git/chatgpt-copilot/out/Extension.js:71714:21
at async RetryOperation._fn (/home/jean/git/chatgpt-copilot/out/Extension.js:22167:19)
Description
I'm trying to use langchain within a VS Code extension I'm working on. I'm able to get a valid response from OpenAI when using the @AI-SDK library, but not when using langchain. I tried several things: a minimal example, streaming, non-streaming. I checked many times that my env variables were correctly set up, and tried passing the "model" or "modelName" argument plus apiKey; nothing works. When I send the request, it hangs for some time and then I get the connection error above.
When I use OpenAI via ai-sdk I don't have the issue.
When I use langchain outside my VS Code extension I don't have the issue either.
Also, the call to OpenAI is successfully logged in my LangSmith account, and I can re-run the request from there manually without any problem, so I'm wondering whether it's a proxy issue or something else.
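In case it helps, this is roughly how I would try to isolate whether the extension host's networking is what breaks the openai client: pass the key explicitly and override the client's fetch so every outgoing request gets logged. This is only a debugging sketch; it relies on the openai client accepting a custom fetch in its options, and debugModel is just an illustrative name.

// Debugging sketch: explicit apiKey plus a logging fetch wrapper, so I can see
// whether the request ever leaves the extension host.
const debugModel = new ChatOpenAI({
  modelName: 'gpt-4o-mini',
  apiKey: process.env.OPENAI_API_KEY,
  maxRetries: 1,
  configuration: {
    fetch: async (url, init) => {
      logger.info(`openai request: ${url}`);
      return fetch(url, init); // falls back to the global fetch from Node 18
    },
  },
});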
System Info
"yarn info langchain": yarn_info.txt
platform (windows / linux / mac): WSL2 on Windows
Node version: v18.20.4
yarn version: v1.22.22
These solutions should help you address the connection error when using LangChain within a VS Code extension. If the issue persists, you might want to ensure that your proxy settings are correctly configured and that there are no network restrictions affecting the connection.
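One way to check that from inside the extension host is a plain fetch against the OpenAI API, independent of LangChain; if this probe also hangs or fails, the problem is the environment (proxy or network restrictions) rather than the library. A rough sketch, assuming the global fetch from Node 18 and the OPENAI_API_KEY environment variable:

// Connectivity probe from the extension host, bypassing LangChain entirely.
try {
  const res = await fetch('https://api.openai.com/v1/models', {
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  });
  logger.info(`Connectivity probe status: ${res.status}`);
} catch (probeError) {
  logger.error(`Connectivity probe failed: ${probeError}`);
}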