
Model hyperparameters to change after pretraining on a custom dataset #580

Closed
aswin-giridhar opened this issue Apr 15, 2019 · 1 comment


aswin-giridhar commented Apr 15, 2019

I ran the BERT pretraining code on a custom dataset, and now I want to know which arguments I should change to use the pretrained model. Of the three relevant arguments (vocab_file, config_file, init_checkpoint), the only one I changed is init_checkpoint, which I pointed at the latest checkpoint created by the pretraining code. But when I tried to run it, I got the following error.
[screenshot of the error]

So I tried changing the vocab_size in bert_config.json and running it again. This is the error I am getting now.
[screenshot of the new error]

Could you tell me why I am getting this error?
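
For reference, here is a rough sketch (not the exact setup from this issue; paths and the checkpoint name are placeholders, and it assumes the standard google-research/bert checkpoint layout) of how to check that the vocab file, bert_config.json, and checkpoint agree on the vocabulary size:

```python
import json
import tensorflow as tf

# Placeholder paths -- substitute the files produced by your own pretraining run.
vocab_file = "custom_model/vocab.txt"
bert_config_file = "custom_model/bert_config.json"
init_checkpoint = "custom_model/model.ckpt-100000"

# 1. Number of entries in the vocabulary file used for pretraining.
with open(vocab_file, encoding="utf-8") as f:
    vocab_entries = sum(1 for _ in f)

# 2. vocab_size declared in bert_config.json.
with open(bert_config_file, encoding="utf-8") as f:
    config_vocab_size = json.load(f)["vocab_size"]

# 3. First dimension of the word-embedding table stored in the checkpoint.
reader = tf.train.load_checkpoint(init_checkpoint)
checkpoint_vocab_size = reader.get_variable_to_shape_map()[
    "bert/embeddings/word_embeddings"][0]

# All three numbers should be identical; a mismatch typically shows up as a
# shape/assignment error when the checkpoint is restored.
print(vocab_entries, config_vocab_size, checkpoint_vocab_size)
```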

aswin-giridhar (Author) commented

Working after I reduced the batch_size. For further info, refer to issues #92 and #82.
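
For anyone hitting the same thing: the second error was presumably an out-of-memory error, and the batch size is just a command-line flag on the fine-tuning script. A hedged sketch of launching fine-tuning with a smaller batch size, assuming run_classifier.py from the BERT repo (all paths, the checkpoint name, and the task are placeholders, not the exact command used here):

```python
import subprocess

# Placeholder paths, checkpoint name, and task -- adjust to your own setup.
subprocess.run([
    "python", "run_classifier.py",
    "--task_name=MRPC",
    "--do_train=true",
    "--data_dir=glue_data/MRPC",
    "--vocab_file=custom_model/vocab.txt",
    "--bert_config_file=custom_model/bert_config.json",
    "--init_checkpoint=custom_model/model.ckpt-100000",
    "--max_seq_length=128",   # shorter sequences also reduce GPU memory use
    "--train_batch_size=8",   # lowered from the default of 32 to avoid OOM
    "--output_dir=output/",
], check=True)
```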
