Closed as not planned
Labels
bug (Something isn't working)
Description
Confirm this is an issue with the Python library and not an underlying OpenAI API
- This is an issue with the Python library
Describe the bug
I was following this article on the stream option: https://cookbook.openai.com/examples/how_to_stream_completions.
I am using API version "2023-05-15" with the "gpt-35-turbo" model.
code:
response = open_ai_client.chat.completions.create(
    model=OpenAIConstants.generator,
    messages=[
        {'role': 'user', 'content': "What's 1+1? Answer in one word."}
    ],
    temperature=0,
    stream=True,
    stream_options={"include_usage": True},  # retrieving token usage for the streamed response
)

for chunk in response:
    print(f"choices: {chunk.choices}\nusage: {chunk.usage}")
    print("****************")
Traceback (most recent call last):
File "C:\Users\nandurisai.venkatara\projects\knowledge-base\Archive\sample.py", line 15, in <module>
response = open_ai_client.chat.completions.create(
File "C:\Users\nandurisai.venkatara\projects\knowledge-base\.venv\lib\site-packages\openai\_utils\_utils.py", line 277, in wrapper
return func(*args, **kwargs)
File "C:\Users\nandurisai.venkatara\projects\knowledge-base\.venv\lib\site-packages\openai\resources\chat\completions.py", line 590, in create
return self._post(
File "C:\Users\nandurisai.venkatara\projects\knowledge-base\.venv\lib\site-packages\openai\_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "C:\Users\nandurisai.venkatara\projects\knowledge-base\.venv\lib\site-packages\openai\_base_client.py", line 921, in request
return self._request(
File "C:\Users\nandurisai.venkatara\projects\knowledge-base\.venv\lib\site-packages\openai\_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: stream_options', 'type': 'invalid_request_error', 'param': None, 'code': None}}
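The 400 response suggests that the targeted API version does not recognize the stream_options parameter; the library appears to forward it as-is, so the rejection comes from the server. Until an API version that accepts stream_options is used, one possible workaround (a sketch, not an official fix; the cl100k_base encoding is an assumption for gpt-3.5-turbo-family models) is to drop stream_options and estimate usage client-side with tiktoken:

import tiktoken

# Sketch of a client-side fallback when the API rejects stream_options.
encoding = tiktoken.get_encoding("cl100k_base")  # assumed encoding for gpt-35-turbo

prompt = "What's 1+1? Answer in one word."
response = open_ai_client.chat.completions.create(
    model=OpenAIConstants.generator,
    messages=[{'role': 'user', 'content': prompt}],
    temperature=0,
    stream=True,  # no stream_options: api-version 2023-05-15 rejects it
)

completion_text = ""
for chunk in response:
    # Some chunks may carry no choices or an empty content delta.
    if chunk.choices and chunk.choices[0].delta.content:
        completion_text += chunk.choices[0].delta.content

# Rough counts only; per-message framing overhead is not included.
print(f"~prompt tokens: {len(encoding.encode(prompt))}")
print(f"~completion tokens: {len(encoding.encode(completion_text))}")

Alternatively, moving to an API version that supports stream_options should let the original snippet run unchanged.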
To Reproduce
Same code and traceback as in the description above.
Code snippets
No response
OS
Windows
Python version
3.10
Library version
1.30.1