
Version 1.4.3 is causing chat to not generate responses #10

Closed

kikkoi opened this issue Sep 11, 2023 · 1 comment

Comments


kikkoi commented Sep 11, 2023

This indicates empty 'max tokens' and 'temperature' settings, as they were left at their default of null. It seems to me that the OpenAI ChatGPT API does not allow this.

Owner

longy2k commented Sep 14, 2023

Was your model set to 'gpt-3.5-turbo', and did you have max_tokens set near 4096?

There might be an error such as:

This model's maximum context length is 4097 tokens. However, you requested 4115 tokens (19 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.

I did not display this error, which is why you may have received an empty response.

I will display the error in the next version. Thanks for helping me see this issue :)

If this is not the case, please let me know.

By the way, max_tokens and temperature can be left as null: when null, max_tokens defaults to infinity and temperature defaults to 1.
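For anyone hitting this, here is a minimal sketch of the arithmetic behind that error. The helper is hypothetical (it is not the plugin's actual code, nor the OpenAI SDK), but it mirrors the check the API performs: prompt tokens plus the requested completion (max_tokens) must fit inside the model's context window, which for gpt-3.5-turbo is 4097 tokens per the error message above.

```python
from typing import Optional

# gpt-3.5-turbo context window, per the error message quoted in this issue
CONTEXT_LIMIT = 4097

def check_request(prompt_tokens: int, max_tokens: Optional[int]) -> str:
    """Hypothetical sketch: return 'ok' if the request fits the context
    window, else an error message shaped like OpenAI's."""
    if max_tokens is None:
        # Omitting max_tokens effectively means "use the remaining context",
        # so the request is fine as long as the prompt itself fits.
        return "ok" if prompt_tokens < CONTEXT_LIMIT else "prompt too long"
    requested = prompt_tokens + max_tokens
    if requested > CONTEXT_LIMIT:
        return (
            f"This model's maximum context length is {CONTEXT_LIMIT} tokens. "
            f"However, you requested {requested} tokens ({prompt_tokens} in "
            f"the messages, {max_tokens} in the completion)."
        )
    return "ok"

# The case from this issue: 19 prompt tokens + max_tokens=4096 -> 4115 > 4097
print(check_request(19, 4096))
# Leaving max_tokens null avoids the problem entirely
print(check_request(19, None))
```

This is why setting max_tokens to the full 4096 fails for any non-empty prompt, while leaving it null works.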

@longy2k longy2k closed this as completed Nov 10, 2023
2 participants