Requests after the first one error out #386
Comments
I am seeing something very similar, except that the final error message differs for me: openai.error.InvalidRequestError: Additional properties are not allowed ('logprobs' was unexpected) - 'messages.2'
I'm getting the same error (using GPT-4), after the first message:
I tried to update
I am seeing the same error, on macOS Monterey v12.5 (MacBook Pro i9, 2019) with Python 3.10. Edit: I am getting this error too: openai.error.InvalidRequestError: Additional properties are not allowed ('logprobs' was unexpected) - 'messages.2'
I have the same error as well, on Windows 10 with Python 3.11.5.
The GPT API status page is showing "multiple errors across all models"; could that have something to do with this?
I was using
As an update, I checked both the openai Python library version and the interpreter version, and I had openai 0.28.0, which is not supported by this Open Interpreter version. But after installing openai 0.27.8, it still shows the same error.
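When comparing versions like this, a quick diagnostic is to print what is actually installed in the active environment (package names here are the PyPI ones; this is just an illustrative check, not part of Open Interpreter):

```python
# Diagnostic sketch: print the installed versions of the packages involved,
# or report them as missing. Uses only the standard library.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("openai", "litellm", "open-interpreter"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```

Running this in the same interpreter that crashes rules out the common trap of upgrading one environment while running another.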
I also ran it in debug mode, and you can see the extra logprobs attribute:
Having the same issue as well: ▌ Model set to GPT-4
Tip: To run locally, use interpreter --local
Open Interpreter will require approval before running code. Use interpreter -y to bypass this.
Press CTRL-C to exit.
> interpreter -y
> yo
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.11/bin/interpreter", line 8, in <module>
sys.exit(cli())
^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/interpreter.py", line 131, in cli
cli(self)
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/cli.py", line 207, in cli
interpreter.chat()
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/interpreter.py", line 412, in chat
self.respond()
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/interpreter.py", line 636, in respond
raise Exception(error)
Exception: Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/interpreter.py", line 621, in respond
response = litellm.completion(
^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 620, in wrapper
raise e
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 580, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/timeout.py", line 44, in wrapper
result = future.result(timeout=local_timeout_duration)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 456, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/timeout.py", line 33, in async_func
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/main.py", line 946, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 2238, in exception_type
raise e
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 1741, in exception_type
raise original_exception
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/main.py", line 310, in completion
raise e
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/main.py", line 292, in completion
response = openai.ChatCompletion.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create
return super().create(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
response, _, api_key = requestor.request(
^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_requestor.py", line 298, in request
resp, got_stream = self._interpret_response(result, stream)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_requestor.py", line 700, in _interpret_response
self._interpret_response_line(
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_requestor.py", line 765, in _interpret_response_line
raise self.handle_error_response(
openai.error.InvalidRequestError: Additional properties are not allowed ('logprobs' was unexpected) - 'messages.2'
I think this might be more related to LiteLLM / OpenAI. OpenAI doesn't support the logprobs property inside chat messages. Related issue under the OpenAI package: openai/openai-python#433
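The symptom is consistent with the assistant's previous response being appended back into the message history with extra fields (such as logprobs) that the chat completions endpoint rejects on the next request, which is why only the second and later requests fail. A hedged sketch of a workaround, assuming the allowed key set below (it is not the full API spec):

```python
# Sketch of the suspected failure mode and a workaround: strip message dicts
# down to an assumed set of accepted keys before resending the history.
ALLOWED_KEYS = {"role", "content", "name", "function_call"}  # assumption

def sanitize_messages(messages):
    """Drop unexpected keys such as 'logprobs' from each message dict."""
    return [{k: v for k, v in m.items() if k in ALLOWED_KEYS} for m in messages]

history = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "hello", "logprobs": None},  # offending field
]
print(sanitize_messages(history))
```

This is only illustrative of why the first request succeeds and later ones fail; the actual fix landed upstream in litellm.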
Update: Now instead of the problem above, I'm getting #391. Very bizarre. Update 2: I still had this problem as of two minutes ago, but
Can you run
The issue is definitely originating in litellm: from utils.py
As always, I'm not smart enough to take this any further.
Upgraded litellm and it works like a charm now, thanks!
@jonny7737 you just need to update the version of litellm. Can you try:
You can also DM me on the Open Interpreter Discord; happy to hop on a call and help out.
I did the upgrade a while ago but did not include the version number. Trying now. No go. Can you verify the version number?
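Since the fixed litellm version number was not stated in the thread, one way to verify locally is to compare the installed version against whatever minimum the maintainers confirm (the comparison below is a naive sketch that only handles plain x.y.z strings, not the full PEP 440 rules):

```python
# Hypothetical helper: check the installed version against a minimum.
# Naive dotted-number comparison; sufficient for plain x.y.z version strings.
def at_least(installed: str, minimum: str) -> bool:
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(installed) >= to_tuple(minimum)

print(at_least("0.27.8", "0.28.0"))  # an older install fails the check
```

For real projects, packaging.version.parse is the robust choice; this sketch just avoids an extra dependency.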
@ishaan-jaff Tried this, and it worked for me. I don't know why I didn't think to update the packages. Anyway, thanks a ton.
I can confirm that upgrading litellm fixed the problem for me. Thanks everyone!
Upgrading fixed the problem for me as well 👍
I think my issue is me :^) The first time I saw the 'logprobs' issue was when executing this prompt: 'start 2 python interpreters and run a 15 second count down timer in each with the remaining time printed every second. run them concurrently.' Every time I tried to execute that prompt, oi crashed, so I thought: great, a reproducible sample. I looped on that for a long time. Once I broke out of that loop and tried other prompts, they worked just fine; I seem to be trying something that is not supported. I also tried the troublesome prompt with interpreter running locally (not in a container), and interpreter did not crash (the generated code was wrong, but no crash). I think the litellm update fixed the problem, and my prompt was just causing additional issues. Thanks for the help. The prompt was not important to me; I just wanted to see what interpreter could do with it.
I see. I'm going to close this issue, as it seems the main problem was figured out. Thanks to everyone who helped.
Describe the bug
Using the CLI
Using OpenAI GPT 3.5 and 4.0
When running interpreter, it will let me run one query, but any subsequent query returns this error: KeyError: 'role'. Fast mode and automatic run do not help. It won't even get to writing the code; it just hangs and then waits for more input. If you have any advice or I am missing something, please let me know.
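For context, a KeyError: 'role' is the generic failure when code iterates over a chat history and one entry is missing its role key, e.g. a malformed message appended after the first reply. A minimal illustration (not Open Interpreter's actual code):

```python
# Minimal illustration: indexing m["role"] raises KeyError when a malformed
# message entry in the history lacks that key.
messages = [{"role": "user", "content": "hi"}, {"content": "orphan chunk"}]
try:
    roles = [m["role"] for m in messages]
except KeyError as exc:
    print("KeyError:", exc)
```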
Reproduce
Expected behavior
It's expected to continue running, or not error out when I input more than once.
Screenshots