Closed
I was pretraining our BERT-Base model with multi-GPU training on 8 GPUs. Preprocessing succeeded, but the next step, training, raised an error in run_lm_finetuning.py:
python3 run_lm_finetuning.py --bert_model bert-base-uncased --do_train --train_file vocab007.txt --output_dir models --num_train_epochs 5.0 --learning_rate 3e-5 --train_batch_size 32 --max_seq_length 128
Traceback (most recent call last):
File "run_lm_finetuning.py", line 646, in <module>
main()
File "run_lm_finetuning.py", line 594, in main
loss = model(input_ids, segment_ids, input_mask, lm_label_ids, is_next)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/modules/module.py", line 489, in __call__
result = self.forward(*input, **kwargs)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/parallel/data_parallel.py", line 143, in forward
outputs = self.parallel_apply(replicas, inputs, kwargs)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/parallel/data_parallel.py", line 153, in parallel_apply
return parallel_apply(replicas, inputs, kwargs, self.device_ids[:len(replicas)])
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/parallel/parallel_apply.py", line 83, in parallel_apply
raise output
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/parallel/parallel_apply.py", line 59, in _worker
output = module(*input, **kwargs)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/modules/module.py", line 489, in __call__
result = self.forward(*input, **kwargs)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/pytorch_pretrained_bert/modeling.py", line 695, in forward
output_all_encoded_layers=False)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/modules/module.py", line 489, in __call__
result = self.forward(*input, **kwargs)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/pytorch_pretrained_bert/modeling.py", line 626, in forward
embedding_output = self.embeddings(input_ids, token_type_ids)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/torch/nn/modules/module.py", line 489, in __call__
result = self.forward(*input, **kwargs)
File "/mnt/newvolume/pytorch_bert_env/lib/python3.5/site-packages/pytorch_pretrained_bert/modeling.py", line 187, in forward
seq_length = input_ids.size(1)
RuntimeError: Dimension out of range (expected to be in range of [-1, 0], but got 1)
Thanks.
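For reference, the failure at `seq_length = input_ids.size(1)` means the `input_ids` tensor that reached the embedding layer was 1-D, so it has no dimension 1 to query; the model expects a 2-D `[batch_size, seq_length]` tensor. One common cause is feeding a single sequence without a batch dimension. A minimal sketch reproducing and fixing this (assuming any recent PyTorch, where newer versions raise `IndexError` instead of `RuntimeError` for this condition):

```python
import torch

# A single sequence without a batch dimension is 1-D; asking for
# dimension 1 then fails with "Dimension out of range (expected to be
# in range of [-1, 0], but got 1)".
input_ids = torch.tensor([101, 2023, 2003, 102])  # shape: [4]

try:
    input_ids.size(1)
except (IndexError, RuntimeError) as err:  # RuntimeError on older PyTorch
    print("failed as in the traceback:", err)

# Making the tensor 2-D [batch_size, seq_length] avoids the error.
input_ids = input_ids.unsqueeze(0)  # shape: [1, 4]
seq_length = input_ids.size(1)
print("seq_length:", seq_length)
```

So the fix is usually to check how the training batches are built and ensure every input tensor handed to the model carries a batch dimension.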