
pbabvey/label_masking


Label masking using multi-stage multi-head attention

A new method that uses multi-head attention heads for sequence labeling. This repository contains the notebook with the code used for SemEval-2020 Task 5: Detecting Counterfactuals, where it ranked 2nd. It works competitively with more intricate models such as ALBERT and RoBERTa. The innovation of the model is a multi-stage multi-head attention method applied on top of BERT. The method can be extended to other sequence labeling tasks.
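The full model lives in the notebook; the sketch below only illustrates the core idea as described above: extra multi-head attention stages stacked on top of BERT's token representations, followed by a per-token classifier. The class name `MultiStageAttentionTagger`, the number of stages, the head count, and the residual-plus-LayerNorm wiring are illustrative assumptions, not the exact architecture from the notebook.

```python
# Illustrative sketch only -- stage count, head count, and the
# residual/LayerNorm wiring are assumptions, not the notebook's
# exact configuration.
import torch
import torch.nn as nn
from transformers import BertModel

class MultiStageAttentionTagger(nn.Module):  # hypothetical name
    def __init__(self, num_labels, num_stages=2, num_heads=8,
                 model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # One multi-head attention block per stage, each followed by
        # a residual connection and layer normalization.
        self.stages = nn.ModuleList(
            nn.MultiheadAttention(hidden, num_heads, batch_first=True)
            for _ in range(num_stages)
        )
        self.norms = nn.ModuleList(
            nn.LayerNorm(hidden) for _ in range(num_stages)
        )
        self.classifier = nn.Linear(hidden, num_labels)  # per-token logits

    def forward(self, input_ids, attention_mask):
        x = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        # key_padding_mask expects True at padding positions to ignore.
        pad_mask = attention_mask == 0
        for attn, norm in zip(self.stages, self.norms):
            out, _ = attn(x, x, x, key_padding_mask=pad_mask)
            x = norm(x + out)  # residual + layer norm per stage
        return self.classifier(x)  # (batch, seq_len, num_labels)
```

Each extra stage lets tokens re-attend over the whole sequence after BERT before the per-token labels are predicted, which is the kind of staged refinement the description points at; the notebook may arrange the stages and masking differently.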
