400 - Bad request - Maximum content length. #203
Comments
I'm wondering if I set something up wrong, as that diff doesn't seem especially large.
@rocktimsaikia Wondering if we need to set a max here depending on the model? Line 166 in f53fb85
I think we can get the max for each model from tiktoken.
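To illustrate the idea of a per-model cap: a real implementation would count tokens with a tokenizer such as tiktoken, but as a dependency-free sketch, the snippet below uses the rough "about 4 characters per token" heuristic for English text. The model names and limits in the map are illustrative assumptions, not the project's actual configuration.

```typescript
// Illustrative per-model context limits (assumed values, not from aicommits).
const modelMaxTokens: Record<string, number> = {
	'gpt-3.5-turbo': 4097,
	'text-davinci-003': 4097,
};

// Cheap token estimate: OpenAI tokenizers average roughly 4 characters
// per token for English text. A real tokenizer (tiktoken) is accurate;
// this heuristic can over- or under-count on unusual input.
function estimateTokens(text: string): number {
	return Math.ceil(text.length / 4);
}

// Check whether a prompt plus a requested completion budget fits the
// model's context window, falling back to 4097 for unknown models.
function fitsInContext(
	model: string,
	prompt: string,
	completionBudget: number,
): boolean {
	const limit = modelMaxTokens[model] ?? 4097;
	return estimateTokens(prompt) + completionBudget <= limit;
}
```

A diff that estimates to 5000 tokens plus a 200-token completion budget would be rejected up front instead of producing a 400 from the API.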
Exact same issue here. It was working fine last night with larger diffs, but this morning I've done some minimal work and run into this error. Strange.
Same here
I haven't had time to investigate this yet, but if anyone wants to, feel free to open a PR. I think it's basically setting a limit here: Line 166 in f53fb85
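One way to "set a limit" at that point would be to cap the diff before it is sent to the API. The sketch below is a hypothetical helper, not the project's code: `maxChars` is an assumed knob, chosen so that at roughly 4 characters per token an 8000-character diff is about 2000 tokens, leaving room for the completion.

```typescript
// Hard-cap a git diff to a character budget before building the prompt.
// Cuts at the last newline inside the budget so no diff line is split,
// and appends a marker so the model knows the input was truncated.
function truncateDiff(diff: string, maxChars = 8000): string {
	if (diff.length <= maxChars) {
		return diff;
	}
	const cut = diff.lastIndexOf('\n', maxChars);
	return diff.slice(0, cut > 0 ? cut : maxChars) + '\n[diff truncated]';
}
```

A cap like this trades completeness of the diff for reliability: a truncated prompt may produce a less specific commit message, but it never 400s on context length.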
@privatenumber Sorry to report that I've updated to 1.11.0 and I'm still having the same issue.
That's not the same error:

- "message": "This model's maximum context length is 4097 tokens. However, you requested 5838 tokens (2909 in the messages, 2929 in the completion). Please reduce the length of the messages or completion.",
+ "message": "This model's maximum context length is 4097 tokens. However, your messages resulted in 8332 tokens. Please reduce the length of the messages.",

Previously, we were requesting more tokens than we could, which was just fixed. In your case, your diff is too large. Would you mind filing a new issue for this?
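The distinction between the two errors suggests the shape of the fix: the first error came from asking for a completion budget that overflowed the context window, which can be fixed by clamping the budget; the second means the prompt alone is too big, which no clamp can repair. A hypothetical sketch of that clamping logic, using the 4097-token limit from the error messages:

```typescript
// Clamp the requested completion budget so prompt + completion stays
// inside the model's context window. promptTokens would come from a
// real tokenizer; 4097 matches the limit reported in the errors above.
function clampMaxTokens(
	promptTokens: number,
	requested: number,
	contextLimit = 4097,
): number {
	const remaining = contextLimit - promptTokens;
	if (remaining <= 0) {
		// Even an empty completion won't fit; the prompt itself must shrink.
		throw new Error('Prompt exceeds the model context window');
	}
	return Math.min(requested, remaining);
}
```

With the numbers from the first error (2909 prompt tokens, 2929 requested), this yields a budget of 1188, which fits. With the 8332 prompt tokens from the second error, it throws, signaling that the diff itself must be reduced.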
@privatenumber Done - thank you.
Bug description
When running the command I get the following:
◇ Detected 2 staged files:
     tests/Fixtures/TicketSource/events.index.json
     tests/Unit/TicketSourceAPITest.php
│
◇ Changes analyzed
│
└ ✖ OpenAI API Error: 400 - Bad Request
{
  "error": {
    "message": "This model's maximum context length is 4097 tokens. However, you requested 5838 tokens (2909 in the messages, 2929 in the completion). Please reduce the length of the messages or completion.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": "context_length_exceeded"
  }
}

My changes aren't particularly big. I've created a text file and added a single new test to my test suite.
aicommits version
1.10.0
Environment
System:
  OS: Linux 4.15 Ubuntu 18.04.6 LTS (Bionic Beaver)
  CPU: (4) x64 DO-Regular
  Memory: 3.75 GB / 7.79 GB
  Container: Yes
  Shell: 4.4.20 - /bin/bash
Binaries:
  Node: 16.16.0 - /usr/local/bin/node
  Yarn: 1.19.1 - /usr/bin/yarn
  npm: 8.11.0 - /usr/local/bin/npm

Can you contribute a fix?