This indicates empty 'max tokens' and 'temperature' settings, since they were left at their default of null. It seems to me that the OpenAI ChatGPT API does not allow this.
Was your model set to 'gpt-3.5-turbo', and did you have max_tokens set near 4096?
There might be an error such as:
This model's maximum context length is 4097 tokens. However, you requested 4115 tokens (19 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.
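The arithmetic behind that error can be sketched as a simple check: the prompt tokens plus the requested completion tokens must fit inside the model's context window. A minimal illustration (the function name and the 4097-token limit for gpt-3.5-turbo are taken from the error message above; this is not the API's actual internal code):

```python
CONTEXT_WINDOW = 4097  # gpt-3.5-turbo's context length, per the error message


def check_request(prompt_tokens: int, max_tokens: int) -> int:
    """Raise if prompt + completion would exceed the context window."""
    requested = prompt_tokens + max_tokens
    if requested > CONTEXT_WINDOW:
        raise ValueError(
            f"requested {requested} tokens ({prompt_tokens} in the messages, "
            f"{max_tokens} in the completion) exceeds {CONTEXT_WINDOW}"
        )
    return requested


# 19 prompt tokens + 4096 completion tokens = 4115 > 4097, so this raises:
try:
    check_request(19, 4096)
except ValueError as e:
    print(e)
```

So with max_tokens at 4096, even a 19-token prompt pushes the request over the limit.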
I did not display this error, which is why you may have received an empty response.
I will display the error in the next version. Thanks for helping me see this issue :)
If this is not the case, please let me know.
By the way, max_tokens and temperature can be left as null. If null, max_tokens defaults to inf and temperature defaults to 1.
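One way to get that "null means default" behavior in practice is to omit the fields from the request body entirely, so the API applies its own defaults. A minimal sketch, assuming a plain JSON payload for the chat completions endpoint (the helper `build_chat_payload` is hypothetical, not part of any client library):

```python
import json


def build_chat_payload(messages, model="gpt-3.5-turbo",
                       max_tokens=None, temperature=None):
    """Build a chat-completion request body.

    Fields left as None are omitted, so the API falls back to its
    defaults (max_tokens -> inf, temperature -> 1).
    """
    payload = {"model": model, "messages": messages}
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    if temperature is not None:
        payload["temperature"] = temperature
    return payload


# With both left as None, neither key appears in the request:
body = build_chat_payload([{"role": "user", "content": "Hello"}])
print(json.dumps(body))
```

Sending an explicit JSON null for these fields may be rejected, which would explain the empty response when the settings were left blank; omitting the keys avoids the problem.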