
AF-Adapter (Attention-FFN Adapter): Enhanced Continual Pretraining

Introduction

This repository contains the code for the paper AF-Adapter: Enhanced Continual Pretraining for Building Chinese Biomedical Language Model.

Installation

pip install af-adapter

Usage

TODO

See the bert directory for more details on the model architecture.

See the scripts and examples directories for more training details. A minimal usage sketch follows below.
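Pending the usage docs, here is a minimal sketch of loading a checkpoint produced by this repo with Hugging Face Transformers. The checkpoint path, the example sentence, and the assumption that AF-Adapter checkpoints load through the standard Auto classes are illustrative placeholders, not the published API; consult the bert and examples directories for the actual entry points.

# A minimal, hedged sketch: the checkpoint path below is a placeholder and the
# standard transformers Auto classes are assumed to work with AF-Adapter
# checkpoints; see the examples directory for the actual API.
from transformers import AutoTokenizer, AutoModel

checkpoint = "path/to/af-adapter-checkpoint"  # placeholder, not a released model ID

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

# Encode a Chinese biomedical sentence and run a forward pass.
inputs = tokenizer("患者主诉头痛三天。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)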

Citation

@INPROCEEDINGS{10385733,
  author={Yan, Yongyu and Xue, Kui and Shi, Xiaoming and Ye, Qi and Liu, Jingping and Ruan, Tong},
  booktitle={2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)},
  title={AF Adapter: Continual Pretraining for Building Chinese Biomedical Language Model},
  year={2023},
  pages={953-957},
  keywords={Adaptation models;Head;Biological system modeling;Buildings;Natural languages;Stability analysis;Task analysis;Continual pretraining;Chinese biomedical natural language processing;Adapter tuning},
  doi={10.1109/BIBM58861.2023.10385733}
}
