Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
https://arxiv.org/pdf/1903.12136.pdf

Description
1. It attempts to distill knowledge from a trained BERT model into an LSTM model (a minimal sketch of the objective follows below)
2. It also proposes a data augmentation method for working with small datasets
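As a rough illustration of point 1, the paper trains the small student to match the fine-tuned BERT teacher's logits with an MSE term alongside the usual cross-entropy loss on gold labels. Below is a minimal PyTorch sketch of that combined objective; the weighting `alpha` default and the function name are illustrative choices, not taken from the paper's code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Combine cross-entropy on gold labels with logit-matching distillation.

    `alpha` (illustrative) balances the hard-label term against the
    MSE term between student and teacher logits.
    """
    ce = F.cross_entropy(student_logits, labels)        # hard-label term
    mse = F.mse_loss(student_logits, teacher_logits)    # logit-matching (distillation) term
    return alpha * ce + (1.0 - alpha) * mse

# Usage sketch: teacher logits are computed once with gradients disabled.
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(input_ids)
# loss = distillation_loss(student(input_ids), teacher_logits, labels)
```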