Allow max_tokens=-1 for ChatOpenAI and set the default value to -1 #1771
Resolves #1532, resolves #1652.

In the current implementation of `ChatOpenAI`, the default value of `max_tokens` is 256, and it cannot take a value of -1. This PR allows `max_tokens` to take a value of -1. When `max_tokens=-1`, the `max_tokens` parameter is omitted from the API request, which means there is no token limit on the response.

Originally, the default value of `max_tokens` was 256, causing longer responses to be cut off and creating confusion (as in issue #1652). This PR changes the default value of `max_tokens` to -1, preventing these cutoffs from occurring.

The solution was originally proposed by @adlindenberg in the issue discussion (#1532). Thank you, @adlindenberg, for providing the idea for this fix.
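A minimal usage sketch of the behavior this change enables, assuming the `langchain.chat_models` import path and message API of the LangChain version this PR targets:

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# With the new default (max_tokens=-1), the max_tokens parameter is omitted
# from the OpenAI request, so responses are no longer truncated at 256 tokens.
chat = ChatOpenAI()  # equivalent to ChatOpenAI(max_tokens=-1)

# An explicit positive limit still caps the response length as before.
capped = ChatOpenAI(max_tokens=256)

print(chat([HumanMessage(content="Summarize the history of Unicode in detail.")]).content)
```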