This ACL paper from last year (2021) proposed an ELECTRA extension:
Pre-trained text encoders such as BERT and its variants have recently achieved state-of-the-art performances on many NLP tasks. While being effective, these pre-training methods typically demand massive computation resources. To accelerate pre-training, ELECTRA trains a discriminator that predicts whether each input token is replaced by a generator. However, this new task, as a binary classification, is less semantically informative. In this study, we present a new text encoder pre-training method that improves ELECTRA based on multi-task learning. Specifically, we train the discriminator to simultaneously detect replaced tokens and select original tokens from candidate sets. We further develop two techniques to effectively combine all pre-training tasks: (1) using attention-based networks for task-specific heads, and (2) sharing bottom layers of the generator and the discriminator. Extensive experiments on GLUE and SQuAD datasets demonstrate both the effectiveness and the efficiency of our proposed method.
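To make the two pre-training objectives concrete, here is a minimal sketch of how the joint loss described in the abstract could be computed per token: binary cross-entropy for replaced-token detection, plus a softmax cross-entropy over a candidate set for original-token selection. The shapes, the loss weighting `lam`, and the per-token framing are my assumptions for illustration, not the paper's exact formulation.

```python
import math

def rtd_loss(logit: float, is_replaced: bool) -> float:
    """Replaced-token detection: binary cross-entropy for one token."""
    p = 1.0 / (1.0 + math.exp(-logit))
    return -math.log(p if is_replaced else 1.0 - p)

def selection_loss(candidate_logits: list[float], original_idx: int) -> float:
    """Original-token selection: cross-entropy over a candidate set
    that contains the original token (softmax over candidates)."""
    m = max(candidate_logits)  # subtract max for numerical stability
    log_z = m + math.log(sum(math.exp(l - m) for l in candidate_logits))
    return log_z - candidate_logits[original_idx]

def multitask_loss(rtd_logit: float, is_replaced: bool,
                   candidate_logits: list[float], original_idx: int,
                   lam: float = 1.0) -> float:
    """Joint objective: detection loss plus weighted selection loss.
    The weighting scheme `lam` is an assumption, not from the paper."""
    return rtd_loss(rtd_logit, is_replaced) + lam * selection_loss(
        candidate_logits, original_idx)
```

With an uninformative logit (0.0) and two equally scored candidates, both terms reduce to log 2, so the joint loss is 2·log 2 ≈ 1.386; a confident correct selection drives the second term toward zero.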
I would like to work on this to see whether it can easily be added (e.g., by only writing a model conversion script).
Unfortunately, no model weights exist at the moment, so I would like to pre-train a model and check whether the conversion can be done without changing the current ELECTRA implementation too much.
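If the architecture really matches ELECTRA, the conversion script may reduce to a checkpoint-name mapping. The sketch below shows what that step could look like; the TensorFlow variable names are guesses modeled on the original ELECTRA checkpoint layout and have not been verified against the TEAMS repository.

```python
import re

def tf_to_hf_name(tf_name: str) -> str:
    """Map a TensorFlow variable name to a Transformers-style parameter
    name (illustrative rules only; real checkpoints may differ)."""
    name = tf_name.removeprefix("electra/")     # drop the model scope
    name = re.sub(r"layer_(\d+)", r"layer.\1", name)  # layer_0 -> layer.0
    name = name.replace("/", ".")               # TF scopes -> dotted paths
    name = name.replace("kernel", "weight")     # TF dense kernel -> weight
    return name
```

For example, a hypothetical variable `electra/encoder/layer_0/attention/self/query/kernel` would map to `encoder.layer.0.attention.self.query.weight`; dense kernels would additionally need transposing when loaded into PyTorch.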
This issue tracks the integration into Transformers 🤗
Implementation is available in the TensorFlow models repository: https://github.com/tensorflow/models/tree/master/official/projects/teams