Closed
Description
When running run_lm_finetuning.py to fine-tune a language model with the default settings (see the command below), the script sometimes runs successfully but sometimes fails with different errors, such as:
RuntimeError: The size of tensor a must match the size of tensor b at non-singleton dimension 1
RuntimeError: Creating MTGP constants failed. at /pytorch/aten/src/THC/THCTensorRandom.cu:35
RuntimeError: Dimension out of range (expected to be in range of [-1, 0], but got 1)
The problem went away after upgrading from Python 3.5 to Python 3.6.
python run_lm_finetuning.py \
--bert_model ~/bert/models/bert-base-uncased/ \
--do_train \
--train_file ~/bert/codes/samples/sample_text.txt \
--output_dir ~/bert/exp/lm \
--num_train_epochs 5.0 \
--learning_rate 3e-5 \
--train_batch_size 32 \
--max_seq_length 128 \
--on_memory
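For context, the first RuntimeError above is PyTorch's generic shape-mismatch error, raised whenever two tensors with incompatible non-singleton dimensions are combined element-wise. The exact trigger inside run_lm_finetuning.py may differ; this is only a minimal sketch, unrelated to the script itself, that reproduces the same class of error:

```python
import torch

# Two batches whose sizes along dimension 1 disagree: 3 vs. 4.
a = torch.randn(2, 3)
b = torch.randn(2, 4)

try:
    a + b  # element-wise add requires broadcastable shapes
except RuntimeError as e:
    # e.g. "The size of tensor a (3) must match the size of tensor b (4)
    # at non-singleton dimension 1"
    print(e)
```

This suggests that under Python 3.5 some part of the data pipeline occasionally produced batches with inconsistent shapes, which would explain why the failure was intermittent rather than deterministic.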