BERT tokenizer - set special tokens #599
Hi Adrian, BERT already has a few unused tokens in its vocabulary that can be used similarly to the special tokens of GPT.
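A minimal, library-free sketch of what reusing an unused slot means: BERT's WordPiece vocab ships with reserved `[unusedN]` entries, and one can be renamed to a new special token while keeping its id (and hence its pretrained embedding row). The vocab dict, token strings, and the `repurpose_unused` helper below are all illustrative, not the real `bert-base` vocabulary or any library API.

```python
# Toy BERT-style vocab with reserved [unusedN] slots (ids are illustrative).
vocab = {"[PAD]": 0, "[unused0]": 1, "[unused1]": 2, "[CLS]": 3, "[SEP]": 4}

def repurpose_unused(vocab, new_token):
    """Rename the first free [unusedN] entry to `new_token`,
    keeping its id so the pretrained embedding row is reused."""
    slot = next((t for t in vocab if t.startswith("[unused")), None)
    if slot is None:
        raise ValueError("no unused slots left")
    idx = vocab.pop(slot)
    vocab[new_token] = idx
    return idx

# Hypothetical new special token "[EOD]" takes over the [unused0] slot.
eod_id = repurpose_unused(vocab, "[EOD]")
print(eod_id)            # 1
print("[EOD]" in vocab)  # True
```

Because the id is unchanged, the model's embedding matrix needs no resizing; fine-tuning simply updates the (previously untrained) row for that slot.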
If we use one of the unused special tokens from the vocabulary, is it enough to fine-tune on a classification task, or do we need to train an embedding from scratch? Has anyone already done this? I also had two different, somewhat related questions when looking into the implementation:
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Hi,
I was wondering whether the team could extend BERT so that fine-tuning with newly defined special tokens would be possible, just as GPT allows.
@thomwolf Could you share your thoughts with me on that?
Regards,
Adrian.
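For context, the pattern being asked for here, adding brand-new special tokens and growing the embedding table to match, is the approach later versions of the transformers library expose as `tokenizer.add_tokens` plus `model.resize_token_embeddings`. The sketch below is a library-free toy version of that idea; `ToyTokenizer`, `resize_embeddings`, and the `[SPK1]`/`[SPK2]` tokens are stand-ins, not the real API.

```python
import random

class ToyTokenizer:
    """Toy vocab wrapper: new tokens get the next free id."""
    def __init__(self, vocab):
        self.vocab = dict(vocab)

    def add_tokens(self, tokens):
        added = 0
        for tok in tokens:
            if tok not in self.vocab:
                self.vocab[tok] = len(self.vocab)
                added += 1
        return added

def resize_embeddings(embeddings, new_size, dim, seed=0):
    """Keep the pretrained rows, append freshly initialized rows
    for the newly added tokens."""
    rng = random.Random(seed)
    while len(embeddings) < new_size:
        embeddings.append([rng.gauss(0.0, 0.02) for _ in range(dim)])
    return embeddings

tok = ToyTokenizer({"[CLS]": 0, "[SEP]": 1, "hello": 2})
emb = [[0.0] * 4 for _ in range(len(tok.vocab))]  # pretend pretrained table
n_added = tok.add_tokens(["[SPK1]", "[SPK2]"])
emb = resize_embeddings(emb, len(tok.vocab), dim=4)
print(n_added, len(emb))  # 2 5
```

Compared with repurposing `[unusedN]` slots, this route changes the model's embedding-matrix shape, so the new rows start untrained and must be learned during fine-tuning.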