Currently, the default model for text generation is gpt2. Its output isn't good, and I couldn't configure it to work correctly.
I used distilgpt2 and it works great out of the box. I want to create a PR and change it. @xenova What do you think?
The only reason I use gpt2 as the default is because HF uses it as a default:
https://github.com/huggingface/transformers/blob/5fd4e3c87c685fba2dd9615be62131748a8b5ee3/src/transformers/pipelines/__init__.py#LL280
I think default performance might improve once I add more generation parameters (no-repeat n-grams, etc.).
We recently added repetition_penalty and no_repeat_ngram_size generation parameters by the way :)