
This model's maximum context length is 8191 tokens, however you requested 89686 tokens (89686 in your prompt) #1639

Closed
1 task done
Nickonomic opened this issue Apr 15, 2023 · 3 comments · Fixed by #2542

Comments

@Nickonomic

Duplicates

  • I have searched the existing issues

Steps to reproduce

The program is trying to process far more text at once than the model's context window allows, and it happens over and over again.

Adding chunk 17 / 20 to memory
SYSTEM: Command browse_website returned: Error: This model's maximum context length is 8191 tokens, however you requested 89686 tokens (89686 in your prompt;
0 for the completion). Please reduce your prompt; or completion length.
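
For context on the numbers: 89686 tokens is roughly eleven times the 8191-token limit of the embedding model, so the text being added to memory has to be split (or truncated) before it is embedded. Below is a minimal sketch of token-aware splitting with tiktoken, assuming the limit from the error above; this is not Auto-GPT's actual code, just an illustration of the approach.

import tiktoken

EMBEDDING_MODEL = "text-embedding-ada-002"  # assumed embedding model
MAX_TOKENS = 8191                           # limit reported in the error above

def split_into_token_chunks(text: str, max_tokens: int = MAX_TOKENS) -> list[str]:
    """Split text into pieces that each fit the embedding context window."""
    enc = tiktoken.encoding_for_model(EMBEDDING_MODEL)
    tokens = enc.encode(text)
    # Decode each max_tokens-sized slice of the token list back into text.
    return [enc.decode(tokens[i:i + max_tokens]) for i in range(0, len(tokens), max_tokens)]

With max_tokens=8191, the 89686-token input from the error would come out as 11 chunks, each small enough to embed on its own.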

Current behavior

No response

Expected behavior

No response

Your prompt

# Paste your prompt here
@SlistInc

I think this may be related to #796 and #1211?

@endolith
Contributor

No, this is not fixed as of a2e1669:

NEXT ACTION:  COMMAND = read_file ARGUMENTS = {'filename': './README.md'}
Enter 'y' to authorise command, 'y -N' to run N continuous commands, 'n' to exit program, or enter feedback for ...
Input:y
-=-=-=-=-=-=-= COMMAND AUTHORISED BY USER -=-=-=-=-=-=-=
Traceback (most recent call last):
  File "C:\Anaconda3\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Anaconda3\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "鈥autogpt\__main__.py", line 5, in <module>
    autogpt.cli.main()
  File "C:\Anaconda3\lib\site-packages\click\core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "C:\Anaconda3\lib\site-packages\click\core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "C:\Anaconda3\lib\site-packages\click\core.py", line 1635, in invoke
    rv = super().invoke(ctx)
  File "C:\Anaconda3\lib\site-packages\click\core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Anaconda3\lib\site-packages\click\core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "C:\Anaconda3\lib\site-packages\click\decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "鈥autogpt\cli.py", line 177, in main
    agent.start_interaction_loop()
  File "鈥autogpt\agent\agent.py", line 213, in start_interaction_loop
    self.memory.add(memory_to_add)
  File "鈥autogpt\memory\local.py", line 76, in add
    embedding = create_embedding_with_ada(text)
  File "鈥autogpt\llm_utils.py", line 170, in create_embedding_with_ada
    return openai.Embedding.create(
  File "C:\Anaconda3\lib\site-packages\openai\api_resources\embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "C:\Anaconda3\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "C:\Anaconda3\lib\site-packages\openai\api_requestor.py", line 226, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "C:\Anaconda3\lib\site-packages\openai\api_requestor.py", line 619, in _interpret_response
    self._interpret_response_line(
  File "C:\Anaconda3\lib\site-packages\openai\api_requestor.py", line 682, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 11017 tokens (11017 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.

I have a proposed fix in #2039
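
For illustration, here is a rough sketch of the kind of guard such a fix could add inside create_embedding_with_ada (the function name comes from the traceback above), so that openai.Embedding.create is never called with more than the model's 8191 tokens. This is an assumption about the general approach, not the contents of #2039:

import openai
import tiktoken

EMBEDDING_MODEL = "text-embedding-ada-002"
MAX_EMBEDDING_TOKENS = 8191  # context limit reported by the API error

def create_embedding_with_ada(text: str) -> list[float]:
    # Hypothetical guard: truncate the input instead of letting the API
    # reject the whole request with InvalidRequestError.
    enc = tiktoken.encoding_for_model(EMBEDDING_MODEL)
    tokens = enc.encode(text)
    if len(tokens) > MAX_EMBEDDING_TOKENS:
        text = enc.decode(tokens[:MAX_EMBEDDING_TOKENS])
    response = openai.Embedding.create(input=[text], model=EMBEDDING_MODEL)
    return response["data"][0]["embedding"]

Truncation loses the tail of the text, which is why splitting into multiple chunks (which the "Adding chunk 17 / 20 to memory" log above suggests browse_website already attempts) is usually the better option; the point here is only that the embedding call should never see more text than the model can accept.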

@Pwuts
Member

Pwuts commented Apr 22, 2023

@endolith you are assuming that different commands use the same text processing; they don't. For browse_website and get_text_summary, this was fixed in #2542. For issues with other commands, please don't post them here.
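
For readers landing on this issue: the general pattern behind such a fix is to split the page text into token-bounded chunks and summarise each chunk before anything is embedded or memorised. The sketch below illustrates that pattern with assumed model names and chunk sizes; it is not the code from #2542.

import openai
import tiktoken

CHAT_MODEL = "gpt-3.5-turbo"  # assumed summarisation model
CHUNK_TOKENS = 3000           # assumed chunk size, leaving room for the reply

def summarize_text(text: str, question: str) -> str:
    enc = tiktoken.encoding_for_model(CHAT_MODEL)
    tokens = enc.encode(text)
    # Split the page text into token-bounded chunks.
    chunks = [enc.decode(tokens[i:i + CHUNK_TOKENS]) for i in range(0, len(tokens), CHUNK_TOKENS)]
    summaries = []
    for chunk in chunks:
        resp = openai.ChatCompletion.create(
            model=CHAT_MODEL,
            messages=[{
                "role": "user",
                "content": f'Summarize the following text with respect to "{question}":\n\n{chunk}',
            }],
        )
        summaries.append(resp["choices"][0]["message"]["content"])
    if len(summaries) == 1:
        return summaries[0]
    # Recursively condense the per-chunk summaries into one final summary.
    return summarize_text("\n".join(summaries), question)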
