
Allow max_tokens = -1 for ChatOpenAI #1532

Closed
adlindenberg opened this issue Mar 8, 2023 · 2 comments · Fixed by #1782

Comments

@adlindenberg

The Chat API allows omitting the max_tokens param entirely, and other LLMs in LangChain support this by passing -1 as the value. Could you extend that support to the ChatOpenAI model? Something like the change in the screenshot below seems to work.

[screenshot: proposed change to ChatOpenAI's request parameters]
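The original screenshot isn't preserved, but a minimal sketch of the idea looks like the following. The class and method names here are illustrative stand-ins, not the actual LangChain internals: the point is that when max_tokens is -1, the key is dropped from the request payload so the Chat API applies its own default instead of receiving an invalid value.

```python
from typing import Any, Dict


class ChatOpenAIParams:
    """Illustrative stand-in for ChatOpenAI's parameter handling
    (not the actual LangChain class)."""

    def __init__(self, max_tokens: int = -1, temperature: float = 0.7):
        self.max_tokens = max_tokens
        self.temperature = temperature

    def _default_params(self) -> Dict[str, Any]:
        params: Dict[str, Any] = {"temperature": self.temperature}
        # Omit max_tokens entirely when it is -1, so the Chat API
        # falls back to its own default rather than erroring on -1.
        if self.max_tokens != -1:
            params["max_tokens"] = self.max_tokens
        return params


if __name__ == "__main__":
    # No max_tokens key is sent when the sentinel -1 is used.
    print(ChatOpenAIParams()._default_params())
    # An explicit cap is passed through unchanged.
    print(ChatOpenAIParams(max_tokens=256)._default_params())
```

Using -1 as an "unset" sentinel matches how other LangChain LLM wrappers already behave, which is the consistency the issue asks for.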

@Aratako (Contributor) commented Mar 18, 2023

This solution seems to effectively resolve the issue.
Do you plan to create a pull request yourself? If not, I can make the necessary changes. Would that be alright?

@adlindenberg (Author)

@Aratako that'd be fine with me, thank you!
