A new method that uses multi-head attention heads for sequence labeling. This repository contains the notebook with the code used for SemEval-2020 Task 5: Detecting Counterfactuals. The code ranked 2nd and performs competitively with more intricate models such as ALBERT and RoBERTa. The innovation of the model is the use of a multi-stage multi-head attention method on top of BERT. The method can be extended to other sequence labeling tasks.
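The README does not spell out the architecture, so the following is only a minimal sketch of what "multi-stage multi-head attention on top of BERT" could look like: several stacked multi-head attention stages applied to an encoder's hidden states, followed by a per-token classifier. The class name, the number of stages, and the residual wiring are all assumptions for illustration; a random tensor stands in for BERT's output so the snippet runs without downloading a model.

```python
import torch
import torch.nn as nn

class MultiStageAttentionTagger(nn.Module):
    """Hypothetical sketch: stacked multi-head attention stages over
    encoder outputs (e.g. BERT's last hidden states), followed by a
    per-token classifier for sequence labeling."""

    def __init__(self, hidden=768, heads=12, stages=2, num_labels=2):
        super().__init__()
        # Each stage is a standard multi-head self-attention layer.
        self.stages = nn.ModuleList(
            nn.MultiheadAttention(hidden, heads, batch_first=True)
            for _ in range(stages)
        )
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, hidden_states, key_padding_mask=None):
        x = hidden_states
        for attn in self.stages:
            out, _ = attn(x, x, x, key_padding_mask=key_padding_mask)
            x = x + out  # residual connection around each stage
        # Per-token label logits: (batch, seq_len, num_labels)
        return self.classifier(x)

# Stand-in for BERT output: 2 sequences of 16 tokens, 768-dim each.
states = torch.randn(2, 16, 768)
logits = MultiStageAttentionTagger()(states)
print(logits.shape)  # torch.Size([2, 16, 2])
```

In practice the `hidden_states` would come from a pretrained encoder (e.g. `BertModel(...).last_hidden_state` in Hugging Face Transformers), and the logits would feed a token-level cross-entropy loss.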
pbabvey/label_masking