
FCA-BERT

This repo provides the code for reproducing the experiments in the ACL 2022 paper: Fine- and Coarse-Granularity Hybrid Self-Attention for Efficient BERT. The code is adapted from the PoWER-BERT repository.

Environment

tensorflow-gpu==1.15.0
keras==2.3.0
keras_bert==0.60.0
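
A minimal setup sketch (the virtual-environment name is illustrative; any Python 3.6/3.7 environment compatible with TensorFlow 1.15 should work):

```bash
# Create and activate an isolated environment (TF 1.15 requires Python <= 3.7)
python3 -m venv fca-bert-env
source fca-bert-env/bin/activate

# Install the pinned dependencies listed above
pip install tensorflow-gpu==1.15.0 keras==2.3.0 keras_bert==0.60.0
```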

Dataset

Before running this repo, download the GLUE data and use this script to unpack it into some directory $GLUE_DIR. Also download a TensorFlow pre-trained checkpoint (BERT-base/large, ELECTRA-base/large, DistilBERT, etc.) and unzip it into some directory $BERT_DIR.
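
For example (the download script name and checkpoint archive below are assumptions based on the standard GLUE download script and Google's BERT-base release; adjust the paths to your setup):

```bash
# Download and unpack the GLUE benchmark into $GLUE_DIR
export GLUE_DIR=/path/to/glue_data
python download_glue_data.py --data_dir $GLUE_DIR --tasks all

# Unzip a TensorFlow pre-trained checkpoint (here: BERT-base, uncased) into $BERT_DIR
export BERT_DIR=/path/to/bert_checkpoints
unzip uncased_L-12_H-768_A-12.zip -d $BERT_DIR
```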

Running

The detailed training and inference steps, including the parameters, are given in run.sh.
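
A typical invocation is to edit the path variables in run.sh to point at $GLUE_DIR and $BERT_DIR and then launch it directly (the exact parameters are documented inside the script):

```bash
# Run training/inference with the parameters defined in run.sh
bash run.sh
```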

Citation

@inproceedings{zhao-etal-FCA,
    title = "Fine- and Coarse-Granularity Hybrid Self-Attention for Efficient BERT",
    author = "Zhao, Jing  and
      Wang, Yifan  and
      Bao, Junwei  and
      Wu, Youzheng  and
      He, Xiaodong",
    booktitle = "ACL 2022",
    year = "2022",
    publisher = "Association for Computational Linguistics",
}
