TEAMS: Training ELECTRA Augmented with Multi-word Selection #16466

Open
stefan-it opened this issue Mar 28, 2022 · 1 comment

stefan-it (Collaborator) commented Mar 28, 2022

Hi,

this ACL 2021 paper proposed an ELECTRA extension:

Pre-trained text encoders such as BERT and its variants have recently achieved state-of-the-art performances on many NLP tasks. While being effective, these pre-training methods typically demand massive computation resources. To accelerate pre-training, ELECTRA trains a discriminator that predicts whether each input token is replaced by a generator. However, this new task, as a binary classification, is less semantically informative. In this study, we present a new text encoder pre-training method that improves ELECTRA based on multi-task learning. Specifically, we train the discriminator to simultaneously detect replaced tokens and select original tokens from candidate sets. We further develop two techniques to effectively combine all pre-training tasks: (1) using attention-based networks for task-specific heads, and (2) sharing bottom layers of the generator and the discriminator. Extensive experiments on GLUE and SQuAD datasets demonstrate both the effectiveness and the efficiency of our proposed method.

An implementation is available in the TensorFlow Models repository: https://github.com/tensorflow/models/tree/master/official/projects/teams
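
To make the multi-task setup more concrete, here is a minimal, hypothetical PyTorch sketch of the multi-word selection objective as I read it from the abstract. All class and variable names are my own; the paper uses attention-based task heads, while this sketch uses a plain dot-product scorer just to show the candidate-set classification. The reference implementation is the TF repository linked above.

```python
import torch
import torch.nn as nn

class MultiWordSelectionHead(nn.Module):
    """Illustrative sketch only: score the original token against a small
    candidate set at each corrupted position (per the TEAMS abstract).
    Shapes and names are assumptions, not the reference implementation."""

    def __init__(self, hidden_size: int, embedding_table: nn.Embedding):
        super().__init__()
        self.dense = nn.Linear(hidden_size, hidden_size)
        # tie candidate scoring to the (shared) token embedding table
        self.embedding_table = embedding_table

    def forward(self, hidden_states, candidate_ids, original_index):
        # hidden_states:  (batch, seq_len, hidden) from the discriminator
        # candidate_ids:  (batch, seq_len, k) candidate tokens per position
        # original_index: (batch, seq_len) position of the original token
        #                 inside each candidate set
        h = torch.tanh(self.dense(hidden_states))        # (b, s, h)
        cand = self.embedding_table(candidate_ids)       # (b, s, k, h)
        logits = torch.einsum("bsh,bskh->bsk", h, cand)  # (b, s, k)
        loss = nn.functional.cross_entropy(
            logits.flatten(0, 1), original_index.flatten()
        )
        return loss, logits

# The overall pre-training loss would then combine this with ELECTRA's
# existing objectives, e.g. (weights illustrative):
#   loss = mlm_loss + lambda_rtd * rtd_loss + lambda_mws * mws_loss
```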

I would like to work on this to see whether it can be added easily (e.g. by only writing a model conversion script).

Unfortunately, no model weights exist at the moment, so I would like to pre-train a model and check whether conversion can be done without changing the current ELECTRA implementation too much.
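
For the conversion itself, something along the lines of the existing ELECTRA conversion script should work. This is only a skeleton, under the assumption that the discriminator weights map onto `ElectraForPreTraining`; the actual variable names in a TEAMS checkpoint still need to be inspected:

```python
import tensorflow as tf
from transformers import ElectraConfig, ElectraForPreTraining

def convert_teams_checkpoint(tf_ckpt_path: str, config_file: str, dump_path: str):
    """Skeleton only: the scope-to-parameter mapping is left open until
    a TEAMS checkpoint can actually be inspected."""
    config = ElectraConfig.from_json_file(config_file)
    model = ElectraForPreTraining(config)

    # list the checkpoint variables first to work out the naming scheme
    for name, shape in tf.train.list_variables(tf_ckpt_path):
        array = tf.train.load_variable(tf_ckpt_path, name)
        print(name, shape)
        # TODO: map TF scopes (e.g. a "teams/discriminator/..." prefix,
        # assumed) onto the matching ElectraForPreTraining parameters

    model.save_pretrained(dump_path)
```

If the bottom-layer sharing between generator and discriminator only matters during pre-training, the discriminator alone might load into the current ELECTRA architecture unchanged, which would keep the conversion script small.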

This issue tracks the integration into Transformers 🤗

stefan-it self-assigned this Mar 28, 2022

stefan-it commented
Issue that tracks generation of pre-training data: tensorflow/models#10567
