This repository has been archived by the owner on Aug 10, 2023. It is now read-only.

Support set max_tokens in the V3 ask api #1445

Merged
merged 1 commit into from
Jun 30, 2023

Conversation

gchust
Contributor

@gchust gchust commented Jun 30, 2023

I want to limit the length of responses from the OpenAI API. Currently in revChatGPT, max_tokens is calculated automatically, with no way for the caller to cap it.

Commit messages generated by chatgpt:
feat(V3.py): add support for max_tokens parameter in Chatbot class to limit the number of tokens in a response
refactor(V3.py): use min function to choose the smaller value between the current max_tokens and the new max_tokens parameter to ensure the response does not exceed the desired length
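The clamping logic the commit messages describe can be sketched as follows. This is a minimal illustration, not the actual V3.py code: the function names `count_tokens` and `get_max_tokens`, the context-window constant, and the whitespace-based token counter are all assumptions (the real library computes token counts with tiktoken).

```python
# Sketch of capping an automatically derived token budget with a
# caller-supplied max_tokens, as this PR does via min().
# All names here are illustrative, not the revChatGPT V3.py API.

MODEL_CONTEXT_WINDOW = 4096  # assumed context size


def count_tokens(messages):
    # Placeholder counter; the real library uses a proper tokenizer.
    return sum(len(m["content"].split()) for m in messages)


def get_max_tokens(messages, user_max_tokens=None):
    """Return the per-response token budget.

    Before the PR, the budget was only the space left in the context
    window; the PR additionally caps it with min() when the caller
    passes max_tokens.
    """
    auto_budget = MODEL_CONTEXT_WINDOW - count_tokens(messages)
    if user_max_tokens is None:
        return auto_budget
    return min(auto_budget, user_max_tokens)


messages = [{"role": "user", "content": "hello there"}]
print(get_max_tokens(messages))       # automatic budget: 4094
print(get_max_tokens(messages, 150))  # capped by the caller: 150
```

Using `min()` keeps the old behavior as an upper bound: a caller can only shrink the budget, never request more tokens than fit in the remaining context window.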

@acheong08 acheong08 merged commit b4bf62d into acheong08:main Jun 30, 2023
5 checks passed
