
Allow max_tokens=-1 for ChatOpenAI and set the default value to -1 #1771

Conversation

@Aratako (Contributor) commented Mar 19, 2023

Resolves #1532, resolves #1652.

In the current implementation of ChatOpenAI, the default value of max_tokens is 256, and it cannot be set to -1.
This PR allows max_tokens to take the value -1. When max_tokens=-1, the max_tokens parameter is omitted from the API request, which means there is no token limit on the response.

Because the default value of max_tokens was 256, longer responses were cut off, creating confusion (as in issue #1652). This PR changes the default value of max_tokens to -1, so that cutoffs no longer occur.

The solution was originally proposed by @adlindenberg in the issue discussion (#1532). Thank you, @adlindenberg, for providing the idea for this fix.
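The omission behavior described above can be sketched roughly as follows. This is a minimal illustration, not the actual ChatOpenAI code; `build_request_params` is a hypothetical helper that assembles the payload sent to the OpenAI API.

```python
def build_request_params(max_tokens: int = -1, **kwargs) -> dict:
    """Build a chat completion request payload.

    A max_tokens of -1 means "no limit": the parameter is simply
    left out of the payload, so the API imposes no response cap.
    """
    params = {"model": "gpt-3.5-turbo", **kwargs}
    if max_tokens != -1:
        # Only include max_tokens when an explicit limit is requested.
        params["max_tokens"] = max_tokens
    return params
```

With the default of -1, `build_request_params()` produces a payload without a `max_tokens` key, while `build_request_params(max_tokens=256)` reproduces the old capped behavior.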

@Aratako changed the title from "Allow max tokens=-1 for ChatOpenAI and set the default value to -1" to "Allow max_tokens=-1 for ChatOpenAI and set the default value to -1" on Mar 19, 2023
@hwchase17 (Contributor) left a comment

I think this can be done more easily by just setting it to None?
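The None-based alternative suggested above can be sketched like this. Again, the helper name is illustrative and not the actual implementation; the idea is that `Optional[int] = None` makes "no limit" the natural default without a -1 sentinel.

```python
from typing import Optional


def build_request_params(max_tokens: Optional[int] = None, **kwargs) -> dict:
    """Build a chat completion request payload.

    None (the default) means "no limit": the max_tokens key is
    omitted from the payload entirely.
    """
    params = {"model": "gpt-3.5-turbo", **kwargs}
    if max_tokens is not None:
        params["max_tokens"] = max_tokens
    return params
```

Using None avoids overloading an integer field with a magic value, which is why this approach reads more cleanly than the -1 sentinel.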

@hwchase17 (Contributor) commented: #1782

@Aratako (Contributor, Author) commented Mar 19, 2023

Yes, it seems to be working well. Thank you for your feedback.
I will close this PR now.

@Aratako Aratako closed this Mar 19, 2023
@Aratako Aratako deleted the allow_max_tokens_=_-1_for_ChatOpenAI branch March 19, 2023 17:53
Development

Successfully merging this pull request may close these issues:

- Output cutoff with ChatOpenAI
- Allow max_tokens = -1 for ChatOpenAI