This model's maximum context length is 8191 tokens, however you requested 89686 tokens (89686 in your prompt) #1639
Comments
No, this is not fixed as of a2e1669:
I have a proposed fix in #2039
Duplicates
Steps to reproduce
The program tries to process an absurd amount of information in a single request, and this happens over and over again:
Adding chunk 17 / 20 to memory
SYSTEM: Command browse_website returned: Error: This model's maximum context length is 8191 tokens, however you requested 89686 tokens (89686 in your prompt;
0 for the completion). Please reduce your prompt; or completion length.
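For context, the 8191 tokens mentioned in the error are the model's context window, while the page text returned by browse_website is roughly ten times that size, so the text has to be cut down before anything is sent to the API. Below is a minimal sketch of one way to split text into token-bounded chunks; it is not the actual Auto-GPT code and not the fix in #2039. The tiktoken package, the cl100k_base encoding, the 4000-token budget, and the split_into_chunks name are all assumptions for illustration.

```python
# Minimal sketch: split oversized text into token-bounded chunks before it is
# summarised or embedded. Assumptions: tiktoken is installed, the model uses
# the cl100k_base encoding, and a 4000-token chunk budget keeps each request
# well under the 8191-token context limit. Names are illustrative, not Auto-GPT's.
import tiktoken

MAX_CHUNK_TOKENS = 4000  # assumed per-chunk budget


def split_into_chunks(text: str, max_tokens: int = MAX_CHUNK_TOKENS) -> list[str]:
    """Return pieces of `text` that each decode to at most `max_tokens` tokens."""
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    return [
        enc.decode(tokens[start:start + max_tokens])
        for start in range(0, len(tokens), max_tokens)
    ]


# Usage: process a huge page chunk by chunk instead of in one oversized request.
page_text = "example sentence " * 20000  # stand-in for browse_website output
chunks = split_into_chunks(page_text)
for i, chunk in enumerate(chunks, start=1):
    print(f"Adding chunk {i} / {len(chunks)} to memory")
    # each chunk would be summarised or embedded here
```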
Current behavior
No response
Expected behavior
No response
Your prompt
# Paste your prompt here