
zhangying9128/BTR


This repository contains the source code for our paper Bidirectional Transformer Reranker for Grammatical Error Correction.

Getting Started

Requirements

  • PyTorch version == 1.10.1
  • Python version >= 3.7
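
As a minimal setup sketch (the environment manager, environment name, and pinned torch wheel below are assumptions, not requirements of this repository):

# Optional sketch of one way to satisfy the requirements above.
# The conda environment name and the plain PyPI torch wheel are assumptions.
conda create -n btr python=3.8 -y
conda activate btr
pip install torch==1.10.1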

Clone this repository

git clone https://github.com/zhangying9128/BTR.git

Install transformers (Huggingface) from our repository

Please use our modified transformers.

cd transformers
pip install --editable .
cd ..

Install fairseq from our repository

Please use our modified fairseq.

cd fairseq/
pip install --editable .
cd ..
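
As an optional sanity check that Python picks up both editable installs (this one-liner is only a suggestion, not part of our scripts):

python -c "import transformers, fairseq; print(transformers.__version__, fairseq.__version__)"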

Pre-training the BTR

To reproduce our experimental results, you can follow the steps below to pre-train the BTR yourself, or directly download our pre-trained BTR to the corresponding folder in checkpoints.
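
If you take the download route, placing the checkpoint could look like the sketch below (the download location and file name are assumptions; keep whatever name the download provides and follow the folder names used in checkpoints):

# Hypothetical placement of a downloaded pre-trained BTR checkpoint.
mkdir -p checkpoints/pre-trained-BTR/
mv ~/Downloads/checkpoint_best.pt checkpoints/pre-trained-BTR/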

Download Realnewslike dataset

As mentioned in our paper, we used the Realnewslike dataset, a subset of the C4 dataset, for pre-training the BTR. Please use the following command to download and preprocess it.

bash commands/preprocess_realnewslike.sh

Or, you can directly download our provided processed data-bin realnewslike-t5tokenizer.

Pre-training the BTR with Realnewslike dataset

Before pre-training, please download our fine-tuned T5GEC to the corresponding folder in checkpoints to speed up pre-training, and then use the following command to pre-train the BTR. You need to adjust CUDA_VISIBLE_DEVICES and --distributed-world-size to your environment. The effective batch size is MAX_TOKENS * UPDATE_FREQ * distributed_world_size. Our experiment used two GPUs, so we set --distributed-world-size=2.

SAVE_PATH=checkpoints/pre-trained-BTR/
bash commands/pretrain_BTR.sh
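
As an illustration of the batch-size relationship above, with assumed values (not the ones hard-coded in commands/pretrain_BTR.sh):

# Illustrative arithmetic only; the MAX_TOKENS and UPDATE_FREQ values are assumptions.
export CUDA_VISIBLE_DEVICES=0,1                    # two GPUs, matching --distributed-world-size=2
MAX_TOKENS=4096                                    # tokens per GPU per step
UPDATE_FREQ=8                                      # gradient accumulation steps
WORLD_SIZE=2
echo $((MAX_TOKENS * UPDATE_FREQ * WORLD_SIZE))    # 65536 tokens per update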

Fine-tuning the BTR

After obtaining the pre-trained BTR, you can follow the steps below to fine-tune it yourself, or directly download our fine-tuned BTR to the corresponding folder in checkpoints. As mentioned in our paper, we ran 4 trials with different random seeds. You can use the following fine-tuned BTR models to reproduce our results.

Trial 1 | Trial 2 | Trial 3 | Trial 4
1 | 2 | 3 | 4

Download dataset

Please follow our suggestions to download the grammatical error correction datasets, put their corresponding .src and .tgt files into the corresponding folders in datasets, and follow our steps to process these datasets. Alternatively, you can download our processed data-bin clang8-conll13-conll14-t5tokenizer for fine-tuning the BTR. As mentioned in our paper, we used the cleaned versions of the CoNLL-13 and CoNLL-14 datasets; please check the corresponding m2 files for details.
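
A quick way to confirm the source/target pairs are in place might look like this (the folder and file names in the comments are assumptions; adapt them to the layout that already exists under datasets):

# Hypothetical layout, e.g. datasets/clang8/train.src and datasets/clang8/train.tgt,
# datasets/conll13/valid.src and datasets/conll13/valid.tgt, and so on.
ls datasets/*/*.src datasets/*/*.tgt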

Process CoNLL-13 dataset

Here we give an example of processing the CoNLL-13 dataset and constructing a data-bin from the processed cLang8, CoNLL-13, and CoNLL-14 datasets. Please use the following command to process the data for fairseq.

bash commands/preprocess_gec_datasets.sh

Process candidates for CoNLL-13 dataset

You can use your own candidate files or our provided candidates (generated by the T5GEC) that were used in our paper; please check the folders in data and this link. Here we give an example of processing candidates for the CoNLL-13 dataset. Please use the following command to tokenize the candidates.

bash commands/process_candidates.sh

Fine-tuning the BTR with cLang8 dataset

Before fine-tuning, please copy the pre-trained BTR to SAVE_PATH so that the pre-trained parameters can be loaded, and then use the following command to fine-tune the BTR.

SAVE_PATH=checkpoints/fine-tuned-BTR/
bash commands/finetune_BTR.sh
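
The copy step mentioned above might look like the following sketch (the checkpoint file name is an assumption; use whatever file pre-training actually produced):

# Hypothetical copy of the pre-trained checkpoint into the fine-tuning SAVE_PATH.
mkdir -p checkpoints/fine-tuned-BTR/
cp checkpoints/pre-trained-BTR/checkpoint_best.pt checkpoints/fine-tuned-BTR/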

Reranking candidates with the BTR

Please use the following command to rerank candidates with the BTR. This command generates and saves the corresponding score (Formula 9 in our paper) for each candidate.

bash commands/rerank.sh

Citation:

Please cite as:

@inproceedings{zhang-etal-2023-bidirectional,
    title = "Bidirectional Transformer Reranker for Grammatical Error Correction",
    author = "Zhang, Ying  and
      Kamigaito, Hidetaka  and
      Okumura, Manabu",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.findings-acl.234",
    pages = "3801--3825",
    abstract = "Pre-trained seq2seq models have achieved state-of-the-art results in the grammatical error correction task. However, these models still suffer from a prediction bias due to their unidirectional decoding. Thus, we propose a bidirectional Transformer reranker (BTR), that re-estimates the probability of each candidate sentence generated by the pre-trained seq2seq model. The BTR preserves the seq2seq-style Transformer architecture but utilizes a BERT-style self-attention mechanism in the decoder to compute the probability of each target token by using masked language modeling to capture bidirectional representations from the target context. For guiding the reranking, the BTR adopts negative sampling in the objective function to minimize the unlikelihood. During inference, the BTR gives final results after comparing the reranked top-1 results with the original ones by an acceptance threshold. Experimental results show that, in reranking candidates from a pre-trained seq2seq model, T5-base, the BTR on top of T5-base could yield 65.47 and 71.27 F0.5 scores on the CoNLL-14 and BEA test sets, respectively, and yield 59.52 GLEU score on the JFLEG corpus, with improvements of 0.36, 0.76 and 0.48 points compared with the original T5-base. Furthermore, when reranking candidates from T5-large, the BTR on top of T5-base improved the original T5-large by 0.26 points on the BEA test set.",
}
