Document-to-Sequence-BERT

Multi-Label Classification


Dataset

  • MIMIC-III (a hypothetical loading sketch follows this list)

    • Text - noteevents

    • Label - diagnoses_icd, procedure_icd
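
A hypothetical loading sketch, not taken from this repository: it assumes the standard MIMIC-III v1.4 CSV files (NOTEEVENTS.csv, DIAGNOSES_ICD.csv, PROCEDURES_ICD.csv) and pandas, and restricts the text to discharge summaries as is common in MIMIC-III ICD-coding work; the actual preprocessing here may differ.

```python
# Hypothetical sketch: assemble (note text, ICD-code set) pairs from MIMIC-III.
# File and column names follow the standard MIMIC-III v1.4 CSV release;
# adjust the paths and the note filter to your own setup.
import pandas as pd

notes = pd.read_csv("NOTEEVENTS.csv", usecols=["HADM_ID", "CATEGORY", "TEXT"])
notes = notes[notes["CATEGORY"] == "Discharge summary"]
notes = notes.dropna(subset=["HADM_ID"])     # notes without an admission ID cannot be labelled
notes["HADM_ID"] = notes["HADM_ID"].astype(int)

diag = pd.read_csv("DIAGNOSES_ICD.csv", usecols=["HADM_ID", "ICD9_CODE"])
proc = pd.read_csv("PROCEDURES_ICD.csv", usecols=["HADM_ID", "ICD9_CODE"])
codes = pd.concat([diag, proc]).dropna()

# One row per admission: the note text plus its set of ICD-9 labels.
labels = (codes.groupby("HADM_ID")["ICD9_CODE"]
               .apply(set)
               .reset_index()
               .rename(columns={"ICD9_CODE": "LABELS"}))
data = notes.merge(labels, on="HADM_ID")
```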


Model Structure

  • Document to Sequence Preprocessing

  • Collecting the [CLS] tokens extracted by Document-to-Sequence BERT (D2SBERT)

  • Sequence Attention

  • Classifier (a minimal sketch of the full pipeline follows this list)
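
A minimal sketch of this pipeline, assuming PyTorch and Hugging Face transformers with a bert-base-uncased backbone; the class names, the non-overlapping 512-token chunking, and the learned-query form of the sequence attention are illustrative assumptions, not the repository's actual implementation.

```python
# Hypothetical end-to-end sketch: document-to-sequence preprocessing,
# per-sequence BERT encoding (D2SBERT), sequence attention over the
# [CLS] vectors, and a multi-label classifier.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class SequenceAttention(nn.Module):
    """Learned-query attention over the sequence of [CLS] vectors."""

    def __init__(self, hidden_size):
        super().__init__()
        self.query = nn.Parameter(torch.randn(hidden_size))

    def forward(self, cls_vectors):                 # (batch, n_seq, hidden)
        scores = cls_vectors @ self.query           # (batch, n_seq)
        weights = torch.softmax(scores, dim=-1)
        return (weights.unsqueeze(-1) * cls_vectors).sum(dim=1)   # (batch, hidden)


class D2SBERTClassifier(nn.Module):
    def __init__(self, num_labels, max_seq_len=512):
        super().__init__()
        self.tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.attention = SequenceAttention(self.bert.config.hidden_size)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)
        self.max_seq_len = max_seq_len

    def forward(self, document):
        # Document-to-sequence preprocessing: chunk the token stream into
        # non-overlapping sequences that fit BERT's input length.
        ids = self.tokenizer(document, add_special_tokens=False,
                             return_tensors="pt")["input_ids"][0]
        chunks = ids.split(self.max_seq_len - 2)    # room for [CLS] and [SEP]

        cls_vectors = []
        for chunk in chunks:
            inputs = torch.cat([
                torch.tensor([self.tokenizer.cls_token_id]),
                chunk,
                torch.tensor([self.tokenizer.sep_token_id]),
            ]).unsqueeze(0)
            out = self.bert(input_ids=inputs)
            cls_vectors.append(out.last_hidden_state[:, 0])    # [CLS] embedding

        cls_seq = torch.stack(cls_vectors, dim=1)   # (1, n_seq, hidden)
        doc_vector = self.attention(cls_seq)
        return torch.sigmoid(self.classifier(doc_vector))      # per-label probabilities
```

For training, the pre-sigmoid logits would typically be paired with a binary cross-entropy loss against a multi-hot vector of ICD codes, one dimension per label.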


Result

Model            F1-Macro   F1-Micro
CAML             0.56924    0.64993
SWAM             0.58025    0.65994
EnCAML           0.59653    0.66594
BERT-head        0.49376    0.56627
BERT-tail        0.45453    0.54011
BERT-head-tail   0.49362    0.56566
Proposed model   0.62898    0.68555
