Prompt to be Consistent is Better than Self-Consistent? Few-Shot and Zero-Shot Fact Verification with Pre-trained Language Models (Accepted as Findings of ACL 2023)

This is the official repository of ProToCo, a few-shot and zero-shot fact verification model.

  • ProToCo is a novel prompt-based consistency training method that improves few-shot and zero-shot fact verification by explicitly imposing a general factuality-grounded consistency scheme on PLMs.

Requirements

Our code is developed based on the T-Few codebase. Please refer to the T-Few repo for setup instructions.

The format of input data

Please ensure that the train and test data files are in JSONL format, with the following fields for each line:

{"id": instance id, "gold_evidence_text": gold evidence text, "claim": claim text, "label": label}
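As a minimal illustration, a single line could look like the example below. The id, evidence, and claim values are made-up placeholders, and the label string should follow whatever label set your dataset uses (e.g. FEVER-style SUPPORTS / REFUTES / NOT ENOUGH INFO), so treat it as an assumption rather than a required value.

{"id": "example-0001", "gold_evidence_text": "Barack Obama served as the 44th President of the United States from 2009 to 2017.", "claim": "Barack Obama was the 44th U.S. president.", "label": "SUPPORTS"}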

You may also download the processed files from this link.

Training & Evaluation

To train ProToCo with default hyperparameters in the few-shot setting, run the following command:

sh train_fs.sh

For the zero-shot setting, run the following command:

sh train_zs.sh

After training, the script automatically outputs the test results. You can also customize the hyperparameters and the data directory in default.json to train on your own dataset with specific settings. To evaluate a trained model without any further training, set eval_before_training to true and num_steps to 0 in default.json and run the same command, as sketched below.
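As a rough sketch of that evaluation-only setup, the relevant part of default.json might look like the fragment below. Only eval_before_training and num_steps are named in this README; the surrounding keys (indicated by the ellipsis) depend on the T-Few-style config and should be checked against the actual file.

{
    ...
    "eval_before_training": true,
    "num_steps": 0
}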

Citation

If you use this code in your research, please cite our paper.

@inproceedings{zeng-gao-2023-prompt,
    title = "Prompt to be Consistent is Better than Self-Consistent? Few-Shot and Zero-Shot Fact Verification with Pre-trained Language Models",
    author = "Zeng, Fengzhu  and
      Gao, Wei",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.findings-acl.278",
    pages = "4555--4569"
}

Contact for issues

References & Open sources
