pradeepdev-1995/BERT-models-finetuning

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer model pre-trained on a large corpus with two objectives: masked language modeling (MLM) and next sentence prediction (NSP).
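To make the MLM objective concrete, here is a minimal sketch of asking a pre-trained BERT to fill in a masked token, assuming the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint (both assumptions; the example sentence is illustrative only):

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Assumed checkpoint; any BERT MLM checkpoint would work the same way.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask one token and let the pre-trained model predict it.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary item.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # typically prints "paris"
```

During pre-training, a fraction of input tokens is masked this way and the model is trained with a cross-entropy loss over its predictions for the masked positions.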
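Since this repository covers fine-tuning BERT models, a minimal fine-tuning sketch may help. It assumes the Hugging Face `transformers` library, the `bert-base-uncased` checkpoint, and a hypothetical binary sentiment task; the sentences, labels, and learning rate below are illustrative, not the repository's own setup:

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # hypothetical binary task
)

# Tokenize a toy batch; real fine-tuning would iterate over a dataset.
inputs = tokenizer(
    ["I loved this movie.", "This was a waste of time."],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([1, 0])  # assumed labels: 1 = positive, 0 = negative

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One illustrative training step; the loss is computed internally
# when labels are passed to the model.
model.train()
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"training loss: {outputs.loss.item():.4f}")
```

Fine-tuning adds a small classification head on top of the pre-trained encoder and updates all weights end to end, which is why a small learning rate such as 2e-5 is the usual choice.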