
Size troubles #13

Open
glennfu opened this issue Jun 4, 2023 · 7 comments

Comments

@glennfu

glennfu commented Jun 4, 2023

I'm trying to run this on a project with the following prompt:

Upgrade the Rails version to 7.0.3.1 following known migration guides to upgrade a Rails project from Rails 6 to Rails 7. Ensure that `bundle exec rspec` still passes

I've tried to trim down my files in src_include as much as possible to fit within the default limits, but when I run a prompt I see:

reqJson {
  model: 'gpt-4',
  temperature: 0,
  total_tokens: 4050,
  max_tokens: 29905,
  top_p: 1,
  frequency_penalty: 0,
  presence_penalty: 0,
  rawApi: false,
  messages: [
    {
      role: 'system',
      content: "You are an assistant, who can only reply in JSON object, reply with a yes (in a param named 'reply') if you understand"
    },
    { role: 'assistant', content: '{"reply":"yes"}' },
    {
      role: 'user',
      content: '[object Object]\n' +
        '[object Object]\n' +
        '[object Object]\n' +
        '[object Object]\n' +
        '[object Object]'
    }
  ]
}
getCompletion API error {
  message: "This model's maximum context length is 8192 tokens. However, you requested 29973 tokens (68 in the messages, 29905 in the completion). Please reduce the length of the messages or completion.",
  type: 'invalid_request_error',
  param: 'messages',
  code: 'context_length_exceeded'
}

I'm not sure how to debug further here.
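The `[object Object]` lines in the user message suggest the prompt builder is coercing plain objects to strings before joining them. A minimal sketch of the suspected bug and one possible fix — the `segments` variable and its shape are hypothetical, not smol-dev-js internals:

```javascript
// Hypothetical prompt segments; in the real tool these might be file
// descriptors or message parts.
const segments = [
  { type: "file", path: "Gemfile" },
  { type: "note", text: "upgrade to Rails 7" },
];

// Suspected bug: Array.prototype.join coerces each element via
// Object.prototype.toString, which yields "[object Object]".
const broken = segments.join("\n");

// Possible fix: serialize each segment (or extract its text) before joining.
const fixed = segments.map((s) => JSON.stringify(s)).join("\n");

console.log(broken); // "[object Object]\n[object Object]"
console.log(fixed);
```

This would also explain why the messages only count 68 tokens: the actual file contents never make it into the request.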

@glennfu
Author

glennfu commented Jun 4, 2023

Also, aside from the actual error, I feel like there's a good opportunity for the AI here. If smol-dev-js fed my prompt to the AI and tried to figure out what to do next, it would probably realize it didn't need all my files until much later.

If I were to give this task to a junior developer, the ONLY file they would initially need is the Gemfile. They'd go in, change the version number of rails, and then run `bundle update rails`. The output, assuming there were problems, would give them clues about what to do next without ever needing to open another file. It's very likely they'd be able to resolve any conflicts here without opening or reading any other files.

Once that step was complete, they'd run `bundle exec rspec` to run the tests. They'd read the bottom of the error output to determine what to do next. At that point they'd see stack traces referencing file names they may want to read. A reasonable strategy would then be to run the first failing spec by itself, take the entire output, and feed that output and the content of the files from the stack trace to the model. Until then, it wouldn't have needed the content of any file but the Gemfile.

@PicoCreator
Owner

That's weird that you're seeing [object Object]; it sounds like there is a bug somewhere.

If you have a large project, I strongly suggest applying for Anthropic Claude 100k access, as that generally resolves most size issues.

@shannonlal

@glennfu I ran into something similar, but increasing the file list limit in the config worked for me:

  "limits": {
    "FILE_LIST":10000
  },

@PicoCreator
Owner

How many files are you pushing O_o

@glennfu
Author

glennfu commented Jun 8, 2023

@shannonlal I had actually done the same, but in an effort to narrow down the true problem I excluded more files so that I wouldn't go over the default file list limit.

@glennfu
Author

glennfu commented Jun 10, 2023

> That's weird that you're seeing [object Object]; it sounds like there is a bug somewhere.
>
> If you have a large project, I strongly suggest applying for Anthropic Claude 100k access, as that generally resolves most size issues.

Oh also I applied for Claude what feels like a long time ago, but still haven't gotten access.

@glennfu
Author

glennfu commented Jun 10, 2023

@PicoCreator As another followup I made a new blank project (literally 0 files), typed up a README.md, switched to node 19, then fired it up and got this:

reqJson {
  "model": "gpt-4",
  "temperature": 0,
  "total_tokens": 4050,
  "max_tokens": 29905,
  "top_p": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0,
  "rawApi": false,
  "messages": [
    {
      "role": "system",
      "content": "You are an assistant, who can only reply in JSON object, reply with a yes (in a param named 'reply') if you understand"
    },
    { "role": "assistant", "content": "{\"reply\":\"yes\"}" },
    {
      "role": "user",
      "content": "[object Object]\n[object Object]\n[object Object]\n[object Object]\n[object Object]"
    }
  ]
}
getCompletion API error {
  message: "This model's maximum context length is 8192 tokens. However, you requested 29973 tokens (68 in the messages, 29905 in the completion). Please reduce the length of the messages or completion.",
  type: 'invalid_request_error',
  param: 'messages',
  code: 'context_length_exceeded'
}

I'm starting to think this isn't so much a size problem as a bug. I tried `npm install smol-dev-js`, and I also built from source and linked it with `npm install && npm link` to test that way; both attempts gave the same error. `node -v` shows v19.8.1. Let me know if there's anything else I can do to try and debug this further.
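On the `context_length_exceeded` error itself: the request asks for 29,905 completion tokens on top of 68 message tokens (29,973 total) against gpt-4's 8,192-token window. A sketch of how a client could clamp `max_tokens` to the remaining context — the function and the default limit here are assumptions for illustration, not smol-dev-js code:

```javascript
// Clamp the completion budget so prompt + completion fits the model window.
// 8192 is gpt-4's context length; promptTokens would come from a tokenizer
// count of the messages being sent.
function clampMaxTokens(requestedMaxTokens, promptTokens, contextLimit = 8192) {
  const remaining = contextLimit - promptTokens;
  if (remaining <= 0) {
    throw new Error("Prompt alone exceeds the model's context window");
  }
  return Math.min(requestedMaxTokens, remaining);
}

// Using the numbers from the error message above:
console.log(clampMaxTokens(29905, 68)); // 8124, which fits within 8192
```

This wouldn't fix the `[object Object]` bug, but it would turn the hard API error into a request that at least goes through.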
