How can I generate in batches with GPT-2 while still making use of these awesome sampling techniques: top-k sampling and top-p sampling?
There is already an implementation for generating phrases in batches in issue #3021.
Any advice? Thanks!
@patrickvonplaten
It's on our to-do list :-) Batch generation with GPT-2 is currently not possible, so for now you will have to rely on the code in #3021.
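For anyone landing here before built-in batch support arrives, below is a minimal sketch of batched sampling with GPT-2, modeled on the approach in #3021. It is an illustration under stated assumptions, not the library's API: the `top_k_top_p_filtering` helper is hand-rolled here, and the prompts, `top_k=50`, and `top_p=0.95` values are placeholders. The two key tricks are left-padding with the EOS token (GPT-2 has no pad token of its own) and passing `position_ids` that skip over the padding.

```python
import torch
import torch.nn.functional as F
from transformers import GPT2LMHeadModel, GPT2Tokenizer


def top_k_top_p_filtering(logits, top_k=0, top_p=0.0, filter_value=-float("inf")):
    """Mask logits outside the top-k / nucleus (top-p) set, row by row.

    logits: (batch_size, vocab_size) — each row is filtered independently,
    which is what makes this batch-compatible.
    """
    if top_k > 0:
        # Value of the k-th largest logit per row, shape (batch_size, 1)
        kth_values = torch.topk(logits, top_k)[0][..., -1, None]
        logits = logits.masked_fill(logits < kth_values, filter_value)
    if top_p > 0.0:
        sorted_logits, sorted_indices = torch.sort(logits, descending=True)
        cumulative_probs = torch.cumsum(F.softmax(sorted_logits, dim=-1), dim=-1)
        # Mask tokens once cumulative probability exceeds top_p, shifting the
        # mask right so the first token crossing the threshold is still kept
        sorted_mask = cumulative_probs > top_p
        sorted_mask[..., 1:] = sorted_mask[..., :-1].clone()
        sorted_mask[..., 0] = False
        mask = sorted_mask.scatter(1, sorted_indices, sorted_mask)
        logits = logits.masked_fill(mask, filter_value)
    return logits


tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token   # GPT-2 has no pad token
tokenizer.padding_side = "left"             # left-pad so the last position is real text
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompts = ["Hello, my name is", "The weather today is"]  # illustrative prompts
enc = tokenizer(prompts, return_tensors="pt", padding=True)
input_ids, attention_mask = enc["input_ids"], enc["attention_mask"]

with torch.no_grad():
    for _ in range(20):  # sample 20 new tokens per sequence
        # Position ids must skip the left padding
        position_ids = (attention_mask.cumsum(-1) - 1).clamp(min=0)
        next_logits = model(
            input_ids, attention_mask=attention_mask, position_ids=position_ids
        ).logits[:, -1, :]
        next_logits = top_k_top_p_filtering(next_logits, top_k=50, top_p=0.95)
        probs = F.softmax(next_logits, dim=-1)
        next_tokens = torch.multinomial(probs, num_samples=1)
        input_ids = torch.cat([input_ids, next_tokens], dim=-1)
        attention_mask = torch.cat(
            [attention_mask, attention_mask.new_ones((attention_mask.size(0), 1))],
            dim=-1,
        )

for text in tokenizer.batch_decode(input_ids, skip_special_tokens=True):
    print(text)
```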
Can top-k and top-p sampling be implemented in batches?
Sure, the top-k/top-p sampling function provided there already supports that :-)
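For reference, the filtering helper in the sketch above illustrates why this works: it computes its top-k and top-p masks independently for each row of a `(batch_size, vocab_size)` logits tensor, so the same code handles a batch of one or many sequences without modification.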