Every time I try to chat I get an error about exceeding the number of tokens. Example error message:
Error: error in request:{"error":{"message":"This model's maximum context length is 4096 tokens. However, you requested 4833 tokens (4433 in the messages, 400 in the completion). Please reduce the length of either one, or use the "middle-out" transform to compress your prompt automatically.","code":400}}
I tried different characters, including one I created myself (its prompt is 1664 tokens). At the start everything works fine, but after a few short messages the token-limit error appears again. Does that mean the token count grows just from normal chatting, so the more I write, the further the request goes over the limit? How does this work, and what do I need to do to fix it?
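For context: yes, every chat turn is resent to the model along with the character prompt, so the request grows with the conversation until it exceeds the 4096-token window. The usual fix is to trim the oldest turns (or use a transform like "middle-out") so the prompt plus the reserved completion tokens fit. Below is a minimal sketch of that trimming idea; the 4-characters-per-token estimate and the message format are illustrative assumptions, not how any particular frontend actually counts tokens.

```python
# Sketch of context trimming: always keep the character/system prompt,
# then keep the newest chat turns while they fit the token budget.
# ASSUMPTION: ~4 characters per token is a crude English-text heuristic;
# real frontends use a proper tokenizer.

MAX_CONTEXT = 4096   # model context window (from the error message)
COMPLETION = 400     # tokens reserved for the model's reply (from the error message)

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token.
    return max(1, len(text) // 4)

def trim_history(system_prompt: str, history: list[str]) -> list[str]:
    """Return [system_prompt, ...newest turns that fit the budget].

    history is the chat so far, oldest first; older turns are dropped
    first once the budget is exhausted.
    """
    budget = MAX_CONTEXT - COMPLETION - estimate_tokens(system_prompt)
    kept = []
    # Walk newest-to-oldest, keeping turns while they still fit.
    for msg in reversed(history):
        cost = estimate_tokens(msg)
        if cost > budget:
            break
        kept.append(msg)
        budget -= cost
    kept.reverse()
    return [system_prompt] + kept
```

With a 1664-token character prompt, this leaves roughly 4096 - 400 - 1664 ≈ 2000 tokens for chat history, which is why the error returns after only a few messages.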