Issue with current documentation:
Hi! I wanted to raise an issue with how max_tokens and max_completion_tokens are documented in the ChatOpenAI class. In the dropdown over the class header, max_tokens is shown as the example way to instantiate ChatOpenAI; I'll attach two pictures showing the two spots where max_tokens is used. In the parameter list for the class, however, only max_completion_tokens appears. I also found that any time you pass max_tokens, it gets popped and its value is written to max_completion_tokens. I know OpenAI is trying to phase out max_tokens, but the API still accepts it for non-reasoning models, and the two parameters do slightly different things in OpenAI calls.

At my work, we were only handling max_tokens as an acceptable parameter for gpt-4o calls and discovered that max_tokens was being overwritten rather than passed through. Part of that is on our end for not supporting the new parameter, but because the ChatOpenAI documentation suggests both parameters at different points, we assumed both were handled until we investigated further. It was unclear that max_tokens was being dropped in favor of max_completion_tokens.
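For concreteness, here is a minimal sketch of the behavior I'm describing. It assumes a recent langchain-openai release and pokes at the private _get_request_payload helper (whose name and signature may differ across versions) purely to inspect the request body, so treat it as illustrative rather than official API:

```python
from langchain_openai import ChatOpenAI

# Placeholder key so the client can be constructed without a real credential.
llm = ChatOpenAI(model="gpt-4o", max_tokens=256, api_key="sk-placeholder")

# _get_request_payload is a private helper that builds the body sent to
# the OpenAI API; its exact name/signature may vary by version.
payload = llm._get_request_payload("hello")

print("max_tokens" in payload)               # False: the key was popped
print(payload.get("max_completion_tokens"))  # 256: the value was remapped
```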
Idea or request for content:
Overall, I was wondering if more documentation could be added to explain that:
- max_tokens is deprecated by OpenAI and is overwritten in favor of max_completion_tokens
- langchain's implementation does not handle them as two separate parameters but as one (only max_completion_tokens)
- the recommended example instantiation should use max_completion_tokens instead of max_tokens, so as not to confuse people further; I thought these were being handled as two separate parameters and could not understand why the values I set for max_tokens were disappearing (see the sketch after this list)
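On the third point, the documented example could look something like this sketch, assuming ChatOpenAI accepts max_completion_tokens directly as an init kwarg (its parameter list suggests it does, presumably as an alias for the underlying max_tokens field):

```python
from langchain_openai import ChatOpenAI

# Suggested doc example: name the parameter the way the class's
# parameter list does, instead of the deprecated max_tokens.
llm = ChatOpenAI(
    model="gpt-4o",
    temperature=0,
    max_completion_tokens=256,
)
```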
Or, if changes could be made to still allow max_tokens as a parameter to support legacy systems, that would be amazing.
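For reference, the raw OpenAI SDK still accepts max_tokens for non-reasoning models such as gpt-4o, which is the legacy behavior I'm asking ChatOpenAI to keep supporting. A minimal example of that direct call:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# max_tokens is still accepted by the Chat Completions API for
# non-reasoning models, even though OpenAI is phasing it out.
resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hi"}],
    max_tokens=64,
)
print(resp.choices[0].message.content)
```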
Regardless, I hope that other people don't run into issues with this. Just want to spread awareness!