# roberta


BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer model pre-trained on a large corpus using a combination of two objectives: masked language modeling and next sentence prediction.
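As an illustration of the masked language modeling objective, here is a minimal sketch using the Hugging Face `transformers` library and the pre-trained `roberta-base` checkpoint (the library, checkpoint name, and example sentence are assumptions for demonstration, not part of this page):

```python
# Minimal sketch of masked language modeling with a pre-trained RoBERTa model,
# assuming the Hugging Face `transformers` library and a PyTorch backend are
# installed (pip install transformers torch).
from transformers import pipeline

# Load a fill-mask pipeline backed by the pre-trained roberta-base checkpoint.
fill_mask = pipeline("fill-mask", model="roberta-base")

# RoBERTa uses <mask> as its mask token; the model predicts the hidden word
# from both its left and right context (the "bidirectional" part).
for prediction in fill_mask("The capital of France is <mask>."):
    print(f"{prediction['token_str'].strip():>10}  score={prediction['score']:.3f}")
```

The pipeline returns the highest-scoring candidate tokens for the masked position, which is the same task the model was trained on during pre-training.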

