
Code stuck infinitely when performing Fine-Tuning #25

Closed
akash-isu opened this issue Feb 11, 2021 · 13 comments

@akash-isu

When running the fine-tune operation, the script gets stuck at the following warning.

Epoch: 0%| | 0/8 [00:00<?, ?it/s]/home/akash/.local/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:224: UserWarning: To get the last learning rate computed by the scheduler, please use get_last_lr().
warnings.warn("To get the last learning rate computed by the scheduler, "
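The warning itself is benign: in recent PyTorch versions, reading the learning rate via `scheduler.get_lr()` outside of `step()` emits this UserWarning, and `get_last_lr()` is the supported accessor. A minimal sketch (not the repository's code) of the recommended call:

```python
# Minimal sketch of silencing the UserWarning from the log above:
# read the learning rate with get_last_lr() instead of get_lr().
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

param = torch.nn.Parameter(torch.zeros(1))
optimizer = SGD([param], lr=0.1)
scheduler = StepLR(optimizer, step_size=1, gamma=0.5)

optimizer.step()   # step the optimizer first, then the scheduler
scheduler.step()

# get_last_lr() returns the learning rate(s) most recently computed by
# the scheduler, without triggering the warning that get_lr() does.
print(scheduler.get_last_lr())  # -> [0.05] after one step with gamma=0.5
```

As the thread goes on to establish, this warning is unrelated to the hang itself.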

@guoday
Contributor

guoday commented Feb 13, 2021

It seems that this warning has no effect on fine-tuning, and I am not sure why the code gets stuck indefinitely. Maybe you can update torch and transformers with "pip install --upgrade torch transformers".

@akash-isu
Author

This happens only when fine-tuning for Python. The process gets "Killed" after a long time, with the epoch still at either 0/8 or 1/8.
It doesn't happen for any of the other languages.

@guoday
Contributor

guoday commented Feb 18, 2021

@fengzhangyin Hi, zhangyin. Do you have any idea about this issue?

@akash-isu
Author

The same thing happens when running the run.py script in code2nl.

@guoday
Contributor

guoday commented Feb 20, 2021

Does it happen only for the Python language in code2nl? Can you share the full code2nl log with me?

@akash-isu
Author

akash-isu commented Feb 20, 2021

For code2nl it happens for all languages. Here are the terminal logs from one run (this one on the PHP data):

02/19/2021 21:17:44 - INFO - main - Namespace(adam_epsilon=1e-08, beam_size=10, config_name='', dev_filename='../data/code2nl/CodeSearchNet//php/valid.jsonl', do_eval=True, do_lower_case=False, do_test=False, do_train=True, eval_batch_size=64, eval_steps=600, gradient_accumulation_steps=1, learning_rate=5e-05, load_model_path=None, local_rank=-1, max_grad_norm=1.0, max_source_length=256, max_steps=-1, max_target_length=128, model_name_or_path='microsoft/codebert-base', model_type='roberta', no_cuda=True, num_train_epochs=3.0, output_dir='model/php', seed=42, test_filename=None, tokenizer_name='', train_batch_size=64, train_filename='../data/code2nl/CodeSearchNet//php/train.jsonl', train_steps=30000, warmup_steps=0, weight_decay=0.0)
02/19/2021 21:17:44 - WARNING - main - Process rank: -1, device: cpu, n_gpu: 1, distributed training: False
02/19/2021 21:17:45 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/config.json from cache at /home/adutta/.cache/torch/transformers/1b62771d5f5169b34713b0af1ab85d80e11f7b1812fbf3ee7d03a866c5f58e72.06eb31f0a63f4e8a136733ccac422f0abf9ffa87c3e61104b57e7075a704d008
02/19/2021 21:17:45 - INFO - transformers.configuration_utils - Model config RobertaConfig {
"architectures": [
"RobertaModel"
],
"attention_probs_dropout_prob": 0.1,
"bos_token_id": 0,
"do_sample": false,
"eos_token_id": 2,
"eos_token_ids": 0,
"finetuning_task": null,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"id2label": {
"0": "LABEL_0",
"1": "LABEL_1"
},
"initializer_range": 0.02,
"intermediate_size": 3072,
"is_decoder": false,
"label2id": {
"LABEL_0": 0,
"LABEL_1": 1
},
"layer_norm_eps": 1e-05,
"length_penalty": 1.0,
"max_length": 20,
"max_position_embeddings": 514,
"model_type": "roberta",
"num_attention_heads": 12,
"num_beams": 1,
"num_hidden_layers": 12,
"num_labels": 2,
"num_return_sequences": 1,
"output_attentions": false,
"output_hidden_states": false,
"output_past": true,
"pad_token_id": 1,
"pruned_heads": {},
"repetition_penalty": 1.0,
"temperature": 1.0,
"top_k": 50,
"top_p": 1.0,
"torchscript": false,
"type_vocab_size": 1,
"use_bfloat16": false,
"vocab_size": 50265
}

02/19/2021 21:17:45 - INFO - transformers.tokenization_utils - Model name 'microsoft/codebert-base' not found in model shortcut name list (roberta-base, roberta-large, roberta-large-mnli, distilroberta-base, roberta-base-openai-detector, roberta-large-openai-detector). Assuming 'microsoft/codebert-base' is a path, a model identifier, or url to a directory containing tokenizer files.
02/19/2021 21:17:46 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/vocab.json from cache at /home/adutta/.cache/torch/transformers/aca4dbdf4f074d4e071c2664901fec33c8aa69c35aa0101bc669ed4b44d1f6c3.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be
02/19/2021 21:17:46 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/merges.txt from cache at /home/adutta/.cache/torch/transformers/779a2f0c38ba2ff65d9a3ee23e58db9568f44a20865c412365e3dc540f01743f.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda
02/19/2021 21:17:46 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/added_tokens.json from cache at None
02/19/2021 21:17:46 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/special_tokens_map.json from cache at /home/adutta/.cache/torch/transformers/5a191080da4f00859b5d3d29529f57894583e00ab07b7c940d65c33db4b25d4d.16f949018cf247a2ea7465a74ca9a292212875e5fd72f969e0807011e7f192e4
02/19/2021 21:17:46 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/tokenizer_config.json from cache at /home/adutta/.cache/torch/transformers/1b4723c5fb2d933e11c399450ea233aaf33f093b5cbef3ec864624735380e490.70b5dbd5d3b9b4c9bfb3d1f6464291ff52f6a8d96358899aa3834e173b45092d
02/19/2021 21:17:46 - INFO - transformers.modeling_utils - loading weights file https://s3.amazonaws.com/models.huggingface.co/bert/microsoft/codebert-base/pytorch_model.bin from cache at /home/adutta/.cache/torch/transformers/3416309b564f60f87c1bc2ce8d8a82bb7c1e825b241c816482f750b48a5cdc26.96251fe4478bac0cff9de8ae3201e5847cee59aebbcafdfe6b2c361f9398b349
02/19/2021 21:17:54 - INFO - main - *** Example ***
02/19/2021 21:17:54 - INFO - main - idx: 0
02/19/2021 21:17:54 - INFO - main - source_tokens: ['', 'public', 'function', 'on', 'Channel', 'Pre', 'Delete', '(', 'Resource', 'Controller', 'Event', '$', 'event', ')', ':', 'void', '{', '$', 'channel', '=', '$', 'event', '->', 'get', 'Subject', '(', ')', ';', 'if', '(', '!', '$', 'channel', 'instance', 'of', 'Channel', 'Interface', ')', '{', 'throw', 'new', 'U', 'nexpected', 'Type', 'Exception', '(', '$', 'channel', ',', 'Channel', 'Interface', '::', 'class', ')', ';', '}', '$', 'results', '=', '$', 'this', '->', 'channel', 'Rep', 'ository', '->', 'find', 'By', '(', '[', "'", 'enabled', "'", '=>', 'true', ']', ')', ';', 'if', '(', '!', '$', 'results', '||', '(', 'count', '(', '$', 'results', ')', '===', '1', '&&', 'current', '(', '$', 'results', ')', '===', '$', 'channel', ')', ')', '{', '$', 'event', '->', 'stop', '(', "'", 'sy', 'l', 'ius', '.', 'channel', '.', 'delete', '', 'error', "'", ')', ';', '}', '}', '']
02/19/2021 21:17:54 - INFO - main - source_ids: 0 15110 5043 15 17305 22763 46006 36 13877 48321 44879 68 515 4839 4832 13842 25522 68 4238 5457 68 515 43839 120 47159 36 4839 25606 114 36 27785 68 4238 4327 1116 5331 49136 4839 25522 3211 92 121 42537 40118 48847 36 68 4238 2156 5331 49136 35965 1380 4839 25606 35524 68 775 5457 68 42 43839 4238 22026 39415 43839 465 2765 36 646 128 23949 108 46161 1528 27779 4839 25606 114 36 27785 68 775 45056 36 3212 36 68 775 4839 47408 112 48200 595 36 68 775 4839 47408 68 4238 4839 4839 25522 68 515 43839 912 36 128 8628 462 6125 4 27681 4 46888 1215 44223 108 4839 25606 35524 35524 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - source_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - target_tokens: ['', 'Pre', 'vent', 'channel', 'deletion', 'if', 'no', 'more', 'channels', 'enabled', '.', '']
02/19/2021 21:17:54 - INFO - main - target_ids: 0 22763 9399 4238 43762 114 117 55 6237 9778 479 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - target_mask: 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - *** Example ***
02/19/2021 21:17:54 - INFO - main - idx: 1
02/19/2021 21:17:54 - INFO - main - source_tokens: ['', 'public', 'function', 'get', 'Tax', 'Total', '(', ')', ':', 'int', '{', '$', 'tax', 'Total', '=', '0', ';', 'fore', 'ach', '(', '$', 'this', '->', 'get', 'Adjust', 'ments', '(', 'Adjust', 'ment', 'Interface', '::', 'TA', 'X', '', 'AD', 'JUST', 'MENT', ')', 'as', '$', 'tax', 'Adjust', 'ment', ')', '{', '$', 'tax', 'Total', '+=', '$', 'tax', 'Adjust', 'ment', '->', 'get', 'Amount', '(', ')', ';', '}', 'fore', 'ach', '(', '$', 'this', '->', 'units', 'as', '$', 'unit', ')', '{', '$', 'tax', 'Total', '+=', '$', 'unit', '->', 'get', 'Tax', 'Total', '(', ')', ';', '}', 'return', '$', 'tax', 'Total', ';', '}', '']
02/19/2021 21:17:54 - INFO - main - source_ids: 0 15110 5043 120 29386 37591 36 4839 4832 6979 25522 68 629 37591 5457 321 25606 4899 1488 36 68 42 43839 120 46493 2963 36 29726 1757 49136 35965 16667 1000 1215 2606 43845 12613 4839 25 68 629 46493 1757 4839 25522 68 629 37591 49371 68 629 46493 1757 43839 120 48302 36 4839 25606 35524 4899 1488 36 68 42 43839 2833 25 68 1933 4839 25522 68 629 37591 49371 68 1933 43839 120 29386 37591 36 4839 25606 35524 671 68 629 37591 25606 35524 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - source_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - target_tokens: ['', 'Returns', 'sum', 'of', 'neutral', 'and', 'non', 'neutral', 'tax', 'adjustments', 'on', 'order', 'item', 'and', 'total', 'tax', 'of', 'units', '.', '']
02/19/2021 21:17:54 - INFO - main - target_ids: 0 48826 6797 9 7974 8 786 7974 629 11431 15 645 6880 8 746 629 9 2833 479 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - target_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - *** Example ***
02/19/2021 21:17:54 - INFO - main - idx: 2
02/19/2021 21:17:54 - INFO - main - source_tokens: ['', 'private', 'function', 'is', 'Last', 'Enabled', 'Entity', '(', '$', 'result', ',', '$', 'entity', ')', ':', 'bool', '{', 'return', '!', '$', 'result', '||', '0', '===', 'count', '(', '$', 'result', ')', '||', '(', '1', '===', 'count', '(', '$', 'result', ')', '&&', '$', 'entity', '===', '(', '$', 'result', 'instance', 'of', '\', 'Iter', 'ator', '?', '$', 'result', '->', 'current', '(', ')', ':', 'current', '(', '$', 'result', ')', ')', ')', ';', '}', '']
02/19/2021 21:17:54 - INFO - main - source_ids: 0 22891 5043 16 10285 48582 49448 36 68 898 2156 68 10014 4839 4832 49460 25522 671 27785 68 898 45056 321 47408 3212 36 68 898 4839 45056 36 112 47408 3212 36 68 898 4839 48200 68 10014 47408 36 68 898 4327 1116 44128 47476 2630 17487 68 898 43839 595 36 4839 4832 595 36 68 898 4839 4839 4839 25606 35524 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - source_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - target_tokens: ['', 'If', 'no', 'entity', 'matched', 'the', 'query', 'criteria', 'or', 'a', 'single', 'entity', 'matched', 'which', 'is', 'the', 'same', 'as', 'the', 'entity', 'being', 'validated', 'the', 'entity', 'is', 'the', 'last', 'enabled', 'entity', 'available', '.', '']
02/19/2021 21:17:54 - INFO - main - target_ids: 0 1106 117 10014 9184 5 25860 8608 50 10 881 10014 9184 61 16 5 276 25 5 10014 145 29548 5 10014 16 5 94 9778 10014 577 479 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - target_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - *** Example ***
02/19/2021 21:17:54 - INFO - main - idx: 3
02/19/2021 21:17:54 - INFO - main - source_tokens: ['', 'protected', 'function', 'recal', 'cul', 'ate', 'Total', '(', ')', ':', 'void', '{', '$', 'this', '->', 'total', '=', '$', 'this', '->', 'items', 'Total', '+', '$', 'this', '->', 'adjustments', 'Total', ';', 'if', '(', '$', 'this', '->', 'total', '<', '0', ')', '{', '$', 'this', '->', 'total', '=', '0', ';', '}', '}', '']
02/19/2021 21:17:54 - INFO - main - source_ids: 0 37659 5043 34973 13300 877 37591 36 4839 4832 13842 25522 68 42 43839 746 5457 68 42 43839 1964 37591 2055 68 42 43839 11431 37591 25606 114 36 68 42 43839 746 28696 321 4839 25522 68 42 43839 746 5457 321 25606 35524 35524 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - source_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - target_tokens: ['', 'Items', 'total', '+', 'Adjust', 'ments', 'total', '.', '']
02/19/2021 21:17:54 - INFO - main - target_ids: 0 48485 746 2055 29726 2963 746 479 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - target_mask: 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - *** Example ***
02/19/2021 21:17:54 - INFO - main - idx: 4
02/19/2021 21:17:54 - INFO - main - source_tokens: ['', 'public', 'function', 'login', 'Action', '(', 'Request', '$', 'request', ')', ':', 'Response', '{', '$', 'authentication', 'Ut', 'ils', '=', '$', 'this', '->', 'get', '(', "'", 'security', '.', 'authent', 'ication', '', 'utils', "'", ')', ';', '$', 'error', '=', '$', 'authentication', 'Ut', 'ils', '->', 'get', 'Last', 'Authent', 'ication', 'Error', '(', ')', ';', '$', 'last', 'Us', 'ername', '=', '$', 'authentication', 'Ut', 'ils', '->', 'get', 'Last', 'Us', 'ername', '(', ')', ';', '$', 'options', '=', '$', 'request', '->', 'attributes', '->', 'get', '(', "'", '', 'sy', 'l', 'ius', "'", ')', ';', '$', 'template', '=', '$', 'options', '[', "'", 'template', "'", ']', '??', 'null', ';', 'Ass', 'ert', '::', 'not', 'Null', '(', '$', 'template', ',', "'", 'Template', 'is', 'not', 'configured', ".'", ')', ';', '$', 'form', 'Type', '=', '$', 'options', '[', "'", 'form', "'", ']', '??', 'User', 'Login', 'Type', '::', 'class', ';', '$', 'form', '=', '$', 'this', '->', 'get', '(', "'", 'form', '.', 'f', 'actory', "'", ')', '->', 'create', 'N', 'amed', '(', "''", ',', '$', 'form', 'Type', ')', ';', 'return', '$', 'this', '->', 'render', '(', '$', 'template', ',', '[', "'", 'form', "'", '=>', '$', 'form', '->', 'create', 'View', '(', ')', ',', "'", 'last', '', 'username', "'", '=>', '$', 'last', 'Us', 'ername', ',', "'", 'error', "'", '=>', '$', 'error', ',', ']', ')', ';', '_}', '']
02/19/2021 21:17:54 - INFO - main - source_ids: 0 15110 5043 27754 36082 36 18593 68 2069 4839 4832 19121 25522 68 24790 41967 5290 5457 68 42 43839 120 36 128 15506 4 40907 14086 1215 49320 108 4839 25606 68 5849 5457 68 24790 41967 5290 43839 120 10285 48151 14086 30192 36 4839 25606 68 94 16419 48285 5457 68 24790 41967 5290 43839 120 10285 16419 48285 36 4839 25606 68 1735 5457 68 2069 43839 16763 43839 120 36 128 1215 8628 462 6125 108 4839 25606 68 27663 5457 68 1735 646 128 48790 108 27779 42254 23796 25606 6331 2399 35965 45 49302 36 68 27663 2156 128 49522 16 45 36614 955 4839 25606 68 1026 40118 5457 68 1735 646 128 3899 108 27779 42254 27913 43890 40118 35965 1380 25606 68 1026 5457 68 42 43839 120 36 128 3899 4 506 27670 108 4839 43839 1045 487 7486 36 12801 2156 68 1026 40118 4839 25606 671 68 42 43839 19930 36 68 27663 2156 646 128 3899 108 46161 68 1026 43839 1045 22130 36 4839 2156 128 13751 1215 48852 108 46161 68 94 16419 48285 2156 128 44223 108 46161 68 5849 2156 27779 4839 25606 35524 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - source_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:17:54 - INFO - main - target_tokens: ['', 'Login', '_form', 'action', '.', '']
02/19/2021 21:17:54 - INFO - main - target_ids: 0 43890 1026 814 479 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
02/19/2021 21:17:54 - INFO - main - target_mask: 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02/19/2021 21:19:34 - INFO - main - ***** Running training *****
02/19/2021 21:19:34 - INFO - main - Num examples = 241241
02/19/2021 21:19:34 - INFO - main - Batch size = 64
02/19/2021 21:19:34 - INFO - main - Num epoch = 7
0%| | 0/30000 [00:00<?, ?it/s]

@guoday
Contributor

guoday commented Feb 20, 2021

It's hard for me to figure out the problem from the log alone. Can you try printing something after each line here, to find which line blocks:

CodeBERT/code2nl/run.py

Lines 328 to 342 in 3aafd05

batch = next(train_dataloader)
batch = tuple(t.to(device) for t in batch)
source_ids,source_mask,target_ids,target_mask = batch
loss,_,_ = model(source_ids=source_ids,source_mask=source_mask,target_ids=target_ids,target_mask=target_mask)
if args.n_gpu > 1:
    loss = loss.mean() # mean() to average on multi-gpu.
if args.gradient_accumulation_steps > 1:
    loss = loss / args.gradient_accumulation_steps
tr_loss += loss.item()
train_loss=round(tr_loss*args.gradient_accumulation_steps/(nb_tr_steps+1),4)
bar.set_description("loss {}".format(train_loss))
nb_tr_examples += source_ids.size(0)
nb_tr_steps += 1
loss.backward()

I guess there is a problem with your packages or environment. Have you tried updating torch and transformers with "pip install --upgrade torch transformers"?
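One low-tech way to locate a silent hang or a "Killed" process is to wrap each step in a flushed, timestamped marker, so that the last marker in the log identifies the blocking statement. A generic sketch (the step names are placeholders, not the actual run.py code):

```python
import time

def trace(label):
    """Print a timestamped marker immediately (flush=True), so the last
    marker that appears in the log pinpoints where the process hung or
    was killed before stdout buffering could be flushed."""
    print(f"[{time.strftime('%H:%M:%S')}] reached: {label}", flush=True)

# Hypothetical training step, mirroring the shape of the loop quoted above:
def training_step(batch):
    trace("got batch")
    # ... move tensors to device ...
    trace("batch on device")
    # ... forward pass / loss computation (the suspected blocking line) ...
    trace("loss computed")
    # ... backward pass ...
    trace("backward done")

training_step(batch=None)
```

If the log ends at "batch on device", the process died inside the forward pass, which is exactly what the author reports below.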

@akash-isu
Author

I updated pytorch and transformers and still have the same issue.
Execution reaches the loss calculation: the code runs through line 330 but gets killed while computing the loss on line 331.

@guoday
Contributor

guoday commented Feb 21, 2021

I am not sure whether you are running out of memory. Can you try train_batch_size=1? Also, try using a GPU instead of the CPU.
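A process that dies with a bare "Killed" is usually the Linux OOM killer. Peak memory usage can be logged from inside the script with the standard library alone (Unix-only `resource` module; the function name here is illustrative):

```python
import resource
import sys

def peak_rss_mib():
    """Peak resident set size of this process, in MiB.

    ru_maxrss is reported in KiB on Linux but in bytes on macOS,
    so normalize before converting.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        rss //= 1024  # bytes -> KiB
    return rss / 1024.0  # KiB -> MiB

# Logging this before and after the model(...) call shows whether memory
# climbs toward the machine's limit right before the process is killed.
print(f"peak RSS so far: {peak_rss_mib():.1f} MiB")
```

If the logged value approaches the machine's physical RAM just before the kill, reducing the batch size (or moving to a GPU) is the right fix.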

@akash-isu
Author

I get this error when I set the batch size to 1:

Traceback (most recent call last):
  File "run.py", line 524, in <module>
    main()
  File "run.py", line 440, in main
    preds = model(source_ids=source_ids,source_mask=source_mask)
  File "/home/adutta/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/adutta/Documents/git_repo/CodeBERT/code2nl/model.py", line 89, in forward
    tgt_embeddings = self.encoder.embeddings(input_ids).permute([1,0,2]).contiguous()
  File "/home/adutta/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/adutta/.local/lib/python3.8/site-packages/transformers/models/roberta/modeling_roberta.py", line 116, in forward
    inputs_embeds = self.word_embeddings(input_ids)
  File "/home/adutta/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/adutta/.local/lib/python3.8/site-packages/torch/nn/modules/sparse.py", line 124, in forward
    return F.embedding(
  File "/home/adutta/.local/lib/python3.8/site-packages/torch/nn/functional.py", line 1852, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Input, output and indices must be on the current device

@guoday
Contributor

guoday commented Feb 21, 2021

For code2nl, the code only supports GPU; please remove the --no_cuda parameter.
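For reference, an invocation without the flag might look like the sketch below. This is illustrative only: the flags are the ones visible in the Namespace dump earlier in the thread, the paths match that log, and the batch size of 1 is the debugging value suggested above, not a recommended setting.

```shell
# Same flags as the logged Namespace, with --no_cuda removed so the model
# and the input tensors both land on the GPU (fixes the device mismatch).
python run.py \
  --do_train --do_eval \
  --model_type roberta \
  --model_name_or_path microsoft/codebert-base \
  --train_filename ../data/code2nl/CodeSearchNet/php/train.jsonl \
  --dev_filename ../data/code2nl/CodeSearchNet/php/valid.jsonl \
  --output_dir model/php \
  --max_source_length 256 --max_target_length 128 \
  --train_batch_size 1 --eval_batch_size 1 \
  --learning_rate 5e-5 --train_steps 30000 --eval_steps 600
```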

@akash-isu
Author

For code2nl, it turned out to be CUDA running out of memory; with batch_size=1 it does run.
The other issue, with code search fine-tuning, is still there.

@guoday
Contributor

guoday commented Feb 22, 2021

For code search, I guess the problem is also running out of CPU memory. Have you tried using a GPU with batch_size=1 for code search fine-tuning?
