
Initializing BertMultiTask model error #25

Closed
nigaregr opened this issue Sep 9, 2019 · 2 comments
nigaregr commented Sep 9, 2019

Hi, I am getting the following error when I run BERT pretraining:
09/09/2019 10:13:43 - INFO - logger - Vocabulary contains 30522 tokens
09/09/2019 10:13:43 - INFO - logger - Initializing BertMultiTask model
Traceback (most recent call last):
  File "AzureML-BERT/pretrain/PyTorch/train_nitin.py", line 361, in
    summary_writer = summary_writer)
  File "/home/nigaregr/Documents/AzureML-BERT/pretrain/PyTorch/models.py", line 121, in __init__
    self.network.register_batch(BatchType.PRETRAIN_BATCH, "pretrain_dataset", loss_calculation=BertPretrainingLoss(self.bert_encoder, bert_config))
  File "/home/nigaregr/Documents/AzureML-BERT/pretrain/PyTorch/models.py", line 25, in __init__
    self.cls = BertPreTrainingHeads(config, self.bert.embeddings.word_embeddings.weight)
TypeError: __init__() takes 2 positional arguments but 3 were given
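The error pattern points at a constructor-signature mismatch: `models.py` calls `BertPreTrainingHeads(config, ...weight)` with two arguments, but an older `pytorch-pretrained-bert` defines `__init__` accepting only `config`. A minimal, self-contained reproduction of that mismatch, using hypothetical stand-in classes rather than the real library:

```python
# Stand-in for an older BertPreTrainingHeads whose __init__ takes only `config`.
class OldBertPreTrainingHeads:
    def __init__(self, config):
        self.config = config

# Stand-in for a v0.6.2-style signature that also takes the tied
# embedding weights (as the call site in models.py expects).
class NewBertPreTrainingHeads:
    def __init__(self, config, bert_model_embedding_weights):
        self.config = config
        self.decoder_weight = bert_model_embedding_weights

config, weights = object(), object()

# The caller passes two arguments; against the old signature this
# raises the TypeError reported in the traceback.
try:
    OldBertPreTrainingHeads(config, weights)
except TypeError as e:
    print(type(e).__name__)  # TypeError

# Against the newer-style signature the same call succeeds.
heads = NewBertPreTrainingHeads(config, weights)
```

The class names and attributes here are illustrative stand-ins; only the two-versus-three positional-argument shape mirrors the actual failure.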

@skaarthik
Contributor

Are you using v0.6.2 of the pytorch-pretrained-bert package?

@nigaregr
Author

nigaregr commented Sep 9, 2019

Yes. That fixed it. Thanks.
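For anyone hitting the same error: assuming a pip-based environment, pinning the package to the version the repo expects should resolve the signature mismatch (a sketch, not part of the original thread):

```shell
# Install the exact version AzureML-BERT was written against,
# replacing any older/newer copy already in the environment.
pip install pytorch-pretrained-bert==0.6.2

# Confirm which version is actually installed.
pip show pytorch-pretrained-bert
```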

@nigaregr nigaregr closed this as completed Sep 9, 2019