
Maximum context length exceeded after execute_shell #3244

Closed
1 task done
gtx-cyber opened this issue Apr 25, 2023 · 4 comments · Fixed by #3222
Comments

@gtx-cyber
⚠️ Search for existing issues first ⚠️

  • I have searched the existing issues, and there is no existing issue for my problem

Which Operating System are you using?

Linux

Which version of Auto-GPT are you using?

Latest Release

GPT-3 or GPT-4?

GPT-3.5

Steps to reproduce 🕹

This error occurred when Auto-GPT installed a library via shell while running:
NEXT ACTION: COMMAND = execute_shell ARGUMENTS = {'command_line': 'pip install en_core_web_sm'}
Executing command 'pip install en_core_web_sm' in working directory '/home/appuser/auto_gpt_workspace'

Current behavior 😯

openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 9956 tokens (9956 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.

The program then terminates.

Expected behavior 🤔

Auto-GPT should automatically reduce the prompt's token count instead of terminating.
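For illustration, one possible mitigation is to clip long shell output before it is appended to the prompt. This is a minimal sketch, not Auto-GPT's actual fix (see the linked PR for that); the function name and the chars-per-token heuristic are assumptions. A real implementation would count tokens with the model's tokenizer (e.g. tiktoken) rather than estimating.

```python
def truncate_to_token_budget(text: str, max_tokens: int, chars_per_token: int = 4) -> str:
    """Clip long command output so the prompt stays under the model's
    context limit.

    Uses a crude ~4-chars-per-token estimate; an accurate version would
    encode the text with the model's tokenizer and count real tokens.
    """
    budget_chars = max_tokens * chars_per_token
    if len(text) <= budget_chars:
        return text
    notice = "\n[output truncated to fit context window]"
    # Reserve room for the notice so the result stays within budget.
    return text[: budget_chars - len(notice)] + notice
```

Calling this on the output of `execute_shell` before building the next prompt would keep a verbose `pip install` log from pushing the request past the 8191-token limit reported above.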

Your prompt 📝

# Paste your prompt here

Your Logs 📒

<insert your logs here>
@sorokinvj
I hit the same with:

NEXT ACTION:  COMMAND = execute_shell ARGUMENTS = {'command_line': 'pip list --outdated'}
Executing command 'pip list --outdated' in working directory '/Users/../Auto-GPT-0.2.2/auto_gpt_workspace'

@perrosnk
I have experienced the same issue

@brngdsn
brngdsn commented Apr 25, 2023

FYI, I re-ran mine after the same kind of crash, and when prompted I told it to "decrease token size because you keep erroring out," and it actually worked afterwards. (I also manually accepted each prompt for a few cycles before giving it -n.)
