
max_tokens parameter #25

Closed
marioseixas opened this issue Mar 7, 2023 · 0 comments · Fixed by #26

Comments

@marioseixas
Contributor

Hello,

Congrats on this awesome tool!

Please remove the max_tokens parameter so that it defaults to (4096 − prompt tokens).

According to the official OpenAI API documentation, you should not set max_tokens: its default value is infinity, and the model can process up to 4096 tokens in total.
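To illustrate the request above, here is a minimal sketch of how a client might build the chat-completion request body so that `max_tokens` is only sent when the caller explicitly asks for a cap. The `build_request` helper and its defaults are hypothetical, for illustration only; the key point is that omitting the field defers the output limit to the API's default.

```python
def build_request(messages, model="gpt-3.5-turbo", max_tokens=None):
    """Assemble the JSON body for POST /v1/chat/completions.

    max_tokens is only included when explicitly set by the caller.
    Leaving it out lets the API use its default (unlimited), so the
    model can spend the remaining context window on its reply.
    """
    payload = {"model": model, "messages": messages}
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload

messages = [{"role": "user", "content": "Hello"}]
print(build_request(messages))                   # no max_tokens key sent
print(build_request(messages, max_tokens=256))   # explicit cap when wanted
```

This keeps the parameter available for users who want a hard limit while matching the API's documented default behavior for everyone else.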

[screenshot attachment]
