
Is it possible to train BERT? #3

Open · codertimo opened this issue Oct 17, 2018 · 7 comments
Labels: help wanted · question

codertimo (Owner) commented Oct 17, 2018

Is it possible to achieve the same results as the paper in a reasonable time? I don't have enough GPU compute to reproduce results at Google AI's scale.

If we can't train on the full corpus the way Google did, how can we verify that this code is correct? Training on the 256M corpus without Google-class GPU compute is nearly impossible for me.

If you have any thoughts (e.g., reducing the model size), please let me know!

codertimo added the help wanted and question labels Oct 17, 2018
codertimo self-assigned this Oct 17, 2018
briandw commented Oct 22, 2018

The authors plan to release the full pre-trained model in a few weeks. That leaves the task of loading their model weights into PyTorch. Perhaps ONNX would work for getting the weights out of TF and into PT?

Once the weights have been loaded, it should be possible to validate the fine-tuning results.
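
If ONNX proves awkward, one alternative is to read the TF checkpoint directly and copy the arrays into the PyTorch model. A minimal sketch, assuming a TF 1.x checkpoint; the `name_map` (TF variable name to PyTorch parameter name) and the kernel-transpose rule are assumptions that depend on how the two architectures actually line up:

```python
# Sketch: copy weights straight from a TF checkpoint into a PyTorch model,
# bypassing ONNX. `name_map` is hypothetical and must be built by hand.
import tensorflow as tf
import torch

def load_tf_weights(model, ckpt_path, name_map):
    state_dict = model.state_dict()
    for tf_name, _shape in tf.train.list_variables(ckpt_path):
        if tf_name not in name_map:
            continue  # e.g. skip optimizer slots like .../adam_m, .../adam_v
        array = tf.train.load_variable(ckpt_path, tf_name)
        tensor = torch.from_numpy(array)
        # TF dense kernels are stored (in, out); nn.Linear weights are (out, in)
        if tensor.dim() == 2 and "kernel" in tf_name:
            tensor = tensor.t()
        pt_name = name_map[tf_name]
        assert state_dict[pt_name].shape == tensor.shape, (tf_name, pt_name)
        with torch.no_grad():
            state_dict[pt_name].copy_(tensor)
```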

codertimo (Owner, Author) commented
@briandw Well, I emailed the authors, and they told me the same thing.
I agree that we could generate a PyTorch module using ONNX, but it may be impossible to load the weights into this model unless its architecture exactly matches the TF model's. Do you have any ideas about this?

codertimo changed the title from "architecture & training process verification with fully training" to "Is it possible to train BERT?" Oct 23, 2018
briandw commented Oct 23, 2018

I can try to import the Tensor2tensor model into PT. https://github.com/tensorflow/tensor2tensor
It should be the same process.

briandw commented Oct 23, 2018

@codertimo Should the goal be to train BERT from scratch or to fine-tune the model? I'd say that training from scratch isn't realistic right now. Fine-tuning shouldn't be that resource-intensive and would be very valuable.
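
For concreteness, fine-tuning would roughly mean bolting a classification head onto the loaded encoder. A minimal sketch, where `pretrained_bert`, the encoder's output shape, and the constructor arguments are all assumptions:

```python
import torch
import torch.nn as nn

class BERTClassifier(nn.Module):
    """Classification head on top of a pretrained encoder (sketch)."""

    def __init__(self, bert, hidden=768, n_classes=2):
        super().__init__()
        self.bert = bert  # pretrained encoder, loaded elsewhere
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, tokens, segment_ids):
        # Assumes the encoder returns (batch, seq_len, hidden); take the
        # first token's ([CLS]) final hidden state, as in the paper.
        hidden_states = self.bert(tokens, segment_ids)
        return self.classifier(hidden_states[:, 0])

# `pretrained_bert` is a hypothetical placeholder. The paper fine-tunes the
# whole model with a small learning rate rather than freezing the encoder.
model = BERTClassifier(pretrained_bert)
optimizer = torch.optim.Adam(model.parameters(), lr=3e-5)
```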

codertimo (Owner, Author) commented
@briandw Thank you for your advice. Currently my goal is to train from scratch with a smaller model that can be trained in an ordinary GPU environment, because I want to keep this implementation usable for anyone who needs to train on their own specific domain or language.

But as you said, moving a model trained in TF over to PyTorch is another goal of this project, so I'd like to implement the conversion code for loading the pretrained model too. I'll make a plan and let you all know when the pretrained model and the official BERT implementation come out.
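
For reference, a reduced configuration might look like the following. The import path and constructor signature here are assumptions about this repo's API and may need adjusting:

```python
# Sketch: a reduced-size model for training on a single consumer GPU.
# Import path and constructor arguments are assumed from this repo's code.
from bert_pytorch.model import BERT

# BERT-Base is hidden=768, n_layers=12, attn_heads=12 (~110M parameters).
# Encoder parameter count grows roughly with n_layers * hidden^2, so
# shrinking both dimensions makes the model an order of magnitude smaller.
small_bert = BERT(
    vocab_size=30000,  # depends on the WordPiece/BPE vocab actually built
    hidden=256,
    n_layers=4,
    attn_heads=4,
)
```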

jacobrxz commented
Does this code support distributed training? I mean multi-machine with multi-GPU...
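
Even if the repo doesn't support it yet, multi-machine multi-GPU training could be added with `torch.distributed`. A minimal sketch, where `build_model`, `dataset`, and `epochs` are placeholders for this repo's BERT language model, corpus, and training config:

```python
# Sketch: multi-machine, multi-GPU training, one process per GPU.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

# Rank and world size come from the environment
# (MASTER_ADDR, MASTER_PORT, RANK, WORLD_SIZE).
dist.init_process_group(backend="nccl")
# Assumes ranks are assigned node by node, one per local GPU.
local_rank = dist.get_rank() % torch.cuda.device_count()
torch.cuda.set_device(local_rank)

model = build_model().cuda()  # placeholder: e.g. the repo's BERT LM
model = DistributedDataParallel(model, device_ids=[local_rank])

sampler = DistributedSampler(dataset)  # shards the corpus across processes
loader = DataLoader(dataset, batch_size=32, sampler=sampler)

for epoch in range(epochs):
    sampler.set_epoch(epoch)  # re-shuffle the shards each epoch
    for batch in loader:
        ...  # forward/backward as in single-GPU training; DDP syncs gradients
```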

BerenLuthien commented
@codertimo Have you already trained this model on a small dataset? If so, would you share some info about it? For example, what if we used p2.8xlarge GPUs to train on a 1M dataset from scratch? (Thanks for the wonderful work, BTW.)
