GH-1041: add support for DistilBERT #1044
Conversation
flair/embeddings.py
```diff
@@ -16,6 +16,8 @@
 from pytorch_transformers import (
     BertTokenizer,
     BertModel,
+    DistilBertTokenizer,
```
Perhaps move these declarations to the `if bert_model_or_path.startswith("distilbert"):` part of `BertEmbeddings`?

This way, it won't trigger an error in Travis when initializing the embeddings classes.
Thanks :) I changed the import logic. Now an error message is thrown whenever `DistilBert*` is not found!
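A minimal sketch of what such guarded import logic can look like (the helper name `check_distilbert_available` is hypothetical, not the PR's literal code; the warning text mirrors the log message quoted later in this thread):

```python
import logging

log = logging.getLogger("flair")

# Guard the DistilBERT imports so that importing the module does not fail on
# pytorch-transformers versions that do not ship the DistilBert* classes yet.
try:
    from pytorch_transformers import DistilBertModel, DistilBertTokenizer
except ImportError:
    DistilBertModel = None
    DistilBertTokenizer = None


def check_distilbert_available(bert_model_or_path: str) -> bool:
    """Warn when a DistilBERT model is requested but the classes are missing.

    Hypothetical helper illustrating the guarded-import approach.
    """
    if bert_model_or_path.startswith("distilbert") and DistilBertModel is None:
        log.warning(
            "ATTENTION! To use DistilBert, please first install a recent "
            "version of pytorch-transformers!"
        )
        return False
    return True
```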
Force-pushed from e10e008 to b7ffc28.
Cool, thanks for adding this!
👍
Hi @stefan-it, I wanted to try this out. Can I just do a
Yes, checkout to
When trying to run this I get:

```
2019-08-30 18:32:04,354 ATTENTION! To use DistilBert, please first install a recent version of pytorch-transformers!
```

I have already installed pytorch-transformers and have pulled the latest master code.
Try:

```
$ git clone https://github.com/huggingface/pytorch-transformers.git
$ cd pytorch-transformers
$ pip install -e .
```

I used that during development :)
Okay let me check
Works like a charm, thanks.
Hi,

this PR adds support for DistilBERT, a distilled version of BERT. The following example shows how to use it in Flair:
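(A minimal sketch, assuming DistilBERT is selected through the existing `BertEmbeddings` class via a model name starting with `distilbert`, as discussed in this PR, and that the `distilbert-base-uncased` checkpoint is used:)

```python
from flair.data import Sentence
from flair.embeddings import BertEmbeddings

# A "distilbert*" model name selects DistilBERT inside BertEmbeddings
# (assumption based on this PR's bert_model_or_path.startswith("distilbert") check).
embedding = BertEmbeddings("distilbert-base-uncased")

sentence = Sentence("Berlin and Munich are nice cities .")
embedding.embed(sentence)

for token in sentence:
    print(token)
    print(token.embedding[:5])
```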
Please also refer to this great blog post about distillation 🤗 🤗
I made some experiments on the full CoNLL-2003 corpus: DistilBERT (with this PR-ready version) is only 0.35% behind the BERT (base, uncased) model. A sketch of such an experiment is given below the notice.

Notice: to try DistilBERT, please make sure that you've installed the latest `master` version of `pytorch-transformers` (currently, no released version with DistilBERT exists).
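A minimal sketch of such a CoNLL-2003 experiment (the hyperparameters are illustrative, not the ones behind the reported numbers; `CONLL_03` assumes the corpus is available in the local data folder):

```python
from flair.datasets import CONLL_03
from flair.embeddings import BertEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Load the CoNLL-2003 NER corpus (must be present locally).
corpus = CONLL_03()
tag_dictionary = corpus.make_tag_dictionary(tag_type="ner")

# DistilBERT embeddings via the BertEmbeddings class (see the example above).
embeddings = BertEmbeddings("distilbert-base-uncased")

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type="ner",
    use_crf=True,
)

trainer = ModelTrainer(tagger, corpus)
trainer.train("resources/taggers/ner-distilbert", max_epochs=150)
```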