
Top-k sampling and top-p sampling for generating phrases on batches with GPT-2? #4824

Closed
Barbara931120 opened this issue Jun 6, 2020 · 3 comments


@Barbara931120

How can I generate in batches with GPT-2 while making use of these awesome sampling techniques: top-k sampling and top-p sampling?

There is already an implementation for generating phrases in batches in issue #3021.

Any advice? Thanks!

@patrickvonplaten (Contributor)

It's on our to-do list :-) Currently, batch generation with GPT-2 is not possible, so you will have to rely on the code in #3021.
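As I understand it, the workaround discussed in #3021 left-pads the batch so that every sequence ends at the same position, with an attention mask zeroing out the padding. A minimal sketch of that padding step (the `pad_id` default is a placeholder; GPT-2 has no pad token, so any id works as long as the mask covers it):

```python
def left_pad_batch(token_id_lists, pad_id=0):
    """Left-pad variable-length token-id lists to equal length and build
    the matching attention mask (1 = real token, 0 = padding)."""
    max_len = max(len(seq) for seq in token_id_lists)
    input_ids, attention_mask = [], []
    for seq in token_id_lists:
        pad = max_len - len(seq)
        # pad on the left so the last real token of every row is aligned
        input_ids.append([pad_id] * pad + list(seq))
        attention_mask.append([0] * pad + [1] * len(seq))
    return input_ids, attention_mask
```

The resulting `input_ids` and `attention_mask` can then be fed to the model as tensors, and the next-token logits read from the last position of each row.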

@Barbara931120 (Author)

Can top-k and top-p sampling be implemented in batches?

@patrickvonplaten (Contributor)

Sure, the top-k / top-p sampling function provided there supports that :-)
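Both filters indeed vectorize over the batch dimension, since each row of the logits matrix is filtered independently. A self-contained NumPy sketch (not the library's implementation) of batched top-k and nucleus (top-p) filtering:

```python
import numpy as np

def top_k_top_p_filter(logits, top_k=0, top_p=0.0, filter_value=-np.inf):
    """Filter a batch of next-token logits (shape [batch, vocab]).
    Filtered entries are set to filter_value so that softmax assigns
    them ~zero probability. top_k=0 / top_p=0.0 disable each filter."""
    logits = logits.copy()
    batch, vocab = logits.shape
    if top_k > 0:
        k = min(top_k, vocab)
        # per-row threshold: the k-th largest logit
        kth = np.sort(logits, axis=-1)[:, -k][:, None]
        logits[logits < kth] = filter_value
    if top_p > 0.0:
        # sort each row descending and compute cumulative probabilities
        order = np.argsort(-logits, axis=-1)
        sorted_logits = np.take_along_axis(logits, order, axis=-1)
        shifted = sorted_logits - sorted_logits.max(axis=-1, keepdims=True)
        probs = np.exp(shifted)
        probs /= probs.sum(axis=-1, keepdims=True)
        cum = np.cumsum(probs, axis=-1)
        # drop tokens whose preceding cumulative mass already exceeds
        # top_p; the most likely token of each row is always kept
        remove = cum - probs > top_p
        sorted_logits[remove] = filter_value
        # scatter the filtered values back to their original positions
        np.put_along_axis(logits, order, sorted_logits, axis=-1)
    return logits
```

At each generation step you would take the logits of the last position for every sequence in the batch, filter them with this function, apply softmax, and sample one token per row.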
