System Info

transformers version: 4.41.0

Who can help?

@ArthurZucker and @younesbelkada

Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction

The following code breaks in v4.41.0 (it works on earlier versions).

```python
import torch
from transformers import GenerationConfig, T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-tiny", device_map="cuda"
)
input_ids = torch.tensor([[4, 5, 6, 6, 7]], device="cuda")
model.generate(
    input_ids=input_ids,
    generation_config=GenerationConfig(do_sample=True),
)
```
Error:

```
ValueError: `decoder_start_token_id` or `bos_token_id` has to be defined for encoder-decoder generation.
```
Expected behavior

Expected `generate` to work as before, without manually specifying `decoder_start_token_id` or `bos_token_id` in the `GenerationConfig`.
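As a stopgap until the fix lands, explicitly copying the model's `decoder_start_token_id` into the `GenerationConfig` avoids the error. This is a workaround sketch, not the intended fix: it assumes the checkpoint's config carries a valid `decoder_start_token_id` (T5 checkpoints set it to the pad token, 0), and it runs on CPU so the CUDA device placement from the repro is dropped.

```python
import torch
from transformers import GenerationConfig, T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-tiny")
input_ids = torch.tensor([[4, 5, 6, 6, 7]])

# Workaround: pass decoder_start_token_id explicitly so the v4.41.0
# check for encoder-decoder generation is satisfied.
out = model.generate(
    input_ids=input_ids,
    generation_config=GenerationConfig(
        do_sample=True,
        decoder_start_token_id=model.config.decoder_start_token_id,
    ),
)
```

With this, `generate` completes instead of raising the `ValueError`, at the cost of duplicating a value the model config already knows.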
Thanks @abdulfatir! #30899 from @zucchini-nlp should fix the issue! :)