"Content-Length can't be present with Transfer-Encoding #474

Open
davidfant opened this issue Jul 26, 2024 · 12 comments
Labels
bug Something isn't working triage

Comments

@davidfant

davidfant commented Jul 26, 2024

What Happened?

I'm running the gateway locally with npx @portkey-ai/gateway, and when I use the Node.js SDK I get weird connection errors:

import Portkey from "portkey-ai";

const portkey = new Portkey({
  apiKey: " ", // need to put something here, otherwise Portkey will complain
  baseURL: process.env.PORTKEY_GATEWAY_URL,
  Authorization: `Bearer ${process.env.OPENAI_API_KEY!}`,
  provider: "openai",
});

const completion = await portkey.chat.completions.create({
  messages: [{ role: "user", content: "Hello, world! 1" }],
  model: "gpt-3.5-turbo",
  temperature: 0,
});

The following curl request works:

curl http://localhost:8787/v1/chat/completions -H "x-portkey-provider: openai"  -H "Content-Type: application/json"   -H "Authorization: ..."   -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "system",
        "content": "Hello world"
      }
    ]
  }' | jq

But in Node, with fetch, node-fetch, or axios, I always get the error "Content-Length can't be present with Transfer-Encoding" from Portkey:

import axios from "axios";

// Same payload as the curl request above
const data = {
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello, world! 1" }],
};

axios
  .post("http://localhost:8787/v1/chat/completions", data, {
    headers: {
      "x-portkey-provider": "openai",
      "Content-Type": "application/json",
      Authorization: "Bearer " + process.env.OPENAI_API_KEY!,
    },
  })
  .then((response) => {
    console.log(response.data);
  })
  .catch((error) => {
    console.error("Error:", error);
  });
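For reference, the equivalent request with the built-in fetch (Node 18+) looks roughly like this (a sketch, not my exact code), and it fails against the npx-run gateway in the same way:

// Sketch only: same request via the built-in fetch (Node 18+).
// The payload mirrors the curl example above; fetch hits the same
// "Content-Length can't be present with Transfer-Encoding" failure.
const response = await fetch("http://localhost:8787/v1/chat/completions", {
  method: "POST",
  headers: {
    "x-portkey-provider": "openai",
    "Content-Type": "application/json",
    Authorization: "Bearer " + process.env.OPENAI_API_KEY!,
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello, world! 1" }],
  }),
});
console.log(await response.json());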

This works as expected when running the gateway using Docker.

What Should Have Happened?

The Node SDK should work

Relevant Code Snippet

No response

Your Twitter/LinkedIn

https://twitter.com/da_fant

@davidfant davidfant added the bug Something isn't working label Jul 26, 2024
@narengogi
Contributor

Hey @davidfant, can you add the stack trace? Are you sure this is not a client error? I've run the code snippet locally without any exceptions.

@VisargD
Collaborator

VisargD commented Aug 1, 2024

Hey! Can you please make sure that you are using the latest version of the gateway?

@gabrielmontagne

Having the same problem with @portkey-ai/[email protected] and the node SDK "portkey-ai": "^1.3.2" on the node client.

APIConnectionError: Connection error.
    at OpenAI.makeRequest (file:///tmp/ttaa/node_modules/openai/core.mjs:297:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async main (file:///tmp/ttaa/index.mjs:15:26) {
  status: undefined,
  headers: undefined,
  request_id: undefined,
  error: undefined,
  code: undefined,
  param: undefined,
  type: undefined,
  cause: FetchError: request to http://127.0.0.1:8787/v1/chat/completions failed, reason: Parse Error: Content-Length can't be present with Transfer-Encoding
      at ClientRequest.<anonymous> (/tmp/ttaa/node_modules/node-fetch/lib/index.js:1501:11)
      at ClientRequest.emit (node:events:513:28)
      at Socket.socketOnData (node:_http_client:551:9)
      at Socket.emit (node:events:513:28)
      at addChunk (node:internal/streams/readable:324:12)
      at readableAddChunk (node:internal/streams/readable:297:9)
      at Readable.push (node:internal/streams/readable:234:10)
      at TCP.onStreamRead (node:internal/stream_base_commons:190:23) {
    type: 'system',
    errno: 'HPE_UNEXPECTED_CONTENT_LENGTH',
    code: 'HPE_UNEXPECTED_CONTENT_LENGTH'
  }
}
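For context, HPE_UNEXPECTED_CONTENT_LENGTH comes from Node's HTTP parser (llhttp): by default it rejects any response that carries both a Content-Length and a Transfer-Encoding header. A minimal sketch (hypothetical server and port, not the gateway code) that reproduces the same parse error against a raw TCP server:

// Hypothetical repro sketch: a raw TCP server that answers with BOTH
// Content-Length and Transfer-Encoding, which Node's HTTP client parser
// rejects with HPE_UNEXPECTED_CONTENT_LENGTH.
import net from "node:net";
import http from "node:http";

const server = net.createServer((socket) => {
  socket.end(
    "HTTP/1.1 200 OK\r\n" +
      "Content-Type: application/json\r\n" +
      "Content-Length: 2\r\n" +          // both headers present...
      "Transfer-Encoding: chunked\r\n" + // ...is what the parser refuses
      "\r\n" +
      "2\r\n{}\r\n0\r\n\r\n"
  );
});

server.listen(8788, () => {
  http
    .get("http://127.0.0.1:8788/", (res) => res.resume())
    .on("error", (err) => {
      // Logs: HPE_UNEXPECTED_CONTENT_LENGTH
      //       Parse Error: Content-Length can't be present with Transfer-Encoding
      console.error(err.code, err.message);
      server.close();
    });
});

Node's http.request also exposes an insecureHTTPParser option that relaxes this check on the client side, but the proper fix is for the gateway response to carry only one of the two headers.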

@VisargD
Collaborator

VisargD commented Aug 5, 2024

Hey! We have identified the cause of this. Will be pushing a patch soon.

@gabrielmontagne

I wonder if you have an ETA? We've parked our integration with Portkey pending this fix.

@gabrielmontagne

gabrielmontagne commented Sep 27, 2024

Hi. This is still happening with portkey-ai 1.4.0 and @portkey-ai/gateway 1.7.6.

@VisargD
Collaborator

VisargD commented Sep 28, 2024

Hey @gabrielmontagne - Sorry for the inactivity on this thread. We tried reproducing this on our end with many possible combinations, but we were not able to reproduce it. Would you be okay with getting on a call to debug this? You can join Portkey's Discord. Once you join, we can send you a meet invite.

Discord: https://discord.com/invite/g2DhAMYFKm

@VisargD
Collaborator

VisargD commented Sep 28, 2024

Can you please also specify the version of node that you are using?

@gabrielmontagne

gabrielmontagne commented Sep 29, 2024

Hi @VisargD, thanks for coming back. Ah, I wasn't aware that you couldn't reproduce.
Cool, during the week I'll ping you on Discord to help you debug.

In the meantime, I've tried with all of the following Node versions:

       v18.16.1 *
       v18.20.4 *
->     v20.17.0 *

Perhaps crucially, I'm using the openai lib with portkey generating the headers (roughly as in the sketch below) -- I wanted to try the Portkey constructor, but the types went all crazy. I've tried with openai versions 4.51.0 and 4.65.0.

[image attachment]
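The setup described above usually looks roughly like this; a sketch using portkey-ai's createHeaders helper, which may differ from the exact code in the screenshot:

// Sketch of the described setup (may not match the screenshot exactly):
// the openai client pointed at the locally running gateway, with the
// Portkey-specific headers generated by portkey-ai's createHeaders helper.
import OpenAI from "openai";
import { createHeaders } from "portkey-ai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: "http://127.0.0.1:8787/v1",
  defaultHeaders: createHeaders({ provider: "openai" }),
});

const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello, world!" }],
});
console.log(completion.choices[0].message);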

I'm using [email protected] going into http://127.0.0.1:8787/v1 @portkey-ai/[email protected]

Thanks again!

@VisargD
Collaborator

VisargD commented Oct 1, 2024

Can you try running the gateway locally after pulling this repo? My guess is that you are currently trying the npx command.

@gabrielmontagne

@VisargD, you were 300% right! I was using the npx command.

As you suggested, I cloned this repo and ran it locally with npm run dev:node from the default branch, and the problem disappeared! :-D

I had also tried the npx command with @latest, but perhaps it didn't include whatever fix you had already merged?

@VisargD
Collaborator

VisargD commented Oct 1, 2024

Got it. Ideally, the executable used by the npx command is built from the exact code of the version tags. It looks like there are some behaviour differences when it's used via npx. Let me debug it further.
