How to set the batch size for training? #21
Comments
I successfully set it using
Good work!
Is there any instruction on how to set
You should only need to set @nshazeer FYI
Thanks, but I still can't figure it out. Is the default value
Newer versions of the T5 library simply ignore `--gin_param="tokens_per_batch = 65536"`: google-research/text-to-text-transfer-transformer#21
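For readers who land here with the same problem: in more recent T5 releases the batch size is bound through the `utils.run.batch_size` gin parameter, which takes a `(unit, value)` tuple, rather than through a bare `tokens_per_batch` binding. A minimal sketch of such an invocation, assuming the standard `t5_mesh_transformer` entry point (the model directory and any other flags are illustrative placeholders, not taken from this thread):

```shell
# Sketch only: set the batch size as a (unit, value) tuple via gin.
# Everything other than the --gin_param line is a placeholder.
t5_mesh_transformer \
  --model_dir="gs://your-bucket/your-model" \
  --gin_param="utils.run.batch_size=('tokens_per_batch', 65536)"
```

The tuple form also accepts other units, e.g. `('sequences_per_batch', 128)`, so the same mechanism covers both flags attempted in the question below.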
When I try to change the batch size using `--gin_param="sequences_per_batch=128"` or `--gin_param="tokens_per_batch=65536"`, the batch size always seems to stay at 32:

```
INFO:tensorflow:serialize_num_microbatches: tokens_per_microbatch_per_replica=2048 batch_dim=Dimension(name='batch', size=32) sequence_length={'inputs': 512, 'targets': 114} batch_per_replica=4 num_microbatches=1
I1208 11:05:22.407459 140391696871040 utils.py:1440] serialize_num_microbatches: tokens_per_microbatch_per_replica=2048 batch_dim=Dimension(name='batch', size=32) sequence_length={'inputs': 512, 'targets': 114} batch_per_replica=4 num_microbatches=1
```
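The numbers in that log line are self-consistent, which helps explain why the flag appears to be ignored. A hedged sketch of the arithmetic (the helper name is mine, and the assumption that the per-replica count comes from dividing the microbatch token budget by the longest feature length is inferred from the reported values, not from the T5 source):

```python
# Sketch: reproduce batch_per_replica=4 from the other fields in the log.
def batch_per_replica(tokens_per_microbatch_per_replica, sequence_length):
    # Assume the longest feature bounds how many sequences fit per microbatch.
    max_len = max(sequence_length.values())
    return tokens_per_microbatch_per_replica // max_len

bpr = batch_per_replica(2048, {"inputs": 512, "targets": 114})
print(bpr)  # 4, matching batch_per_replica=4 in the log
```

With a global batch of 32 sequences and 4 per replica, the run is spread over 8 replicas, and `num_microbatches=1` because the whole batch fits in a single pass. The fixed size of 32 suggests the batch size is being taken from a gin default rather than from the attempted command-line overrides.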