Pick-Rank

This repository contains the code for the paper "Forget Demonstrations, Focus on Learning from Textual Instructions".

We use a pointer network to pick out several critical sentences from the task definition, and then apply an additional training objective (i.e., a ranking loss) to train the text-to-text language model.

(Figure: model overview)
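
For intuition, below is a minimal PyTorch sketch of such a combined objective: the usual sequence-to-sequence cross-entropy plus a margin-based ranking loss over sentence scores. The tensor names, margin, and weighting factor are illustrative assumptions, not the exact formulation from the paper.

# Minimal sketch (illustrative only): seq2seq cross-entropy combined with a
# margin ranking loss. Names, margin, and weighting are assumptions, not the
# exact formulation used in the paper.
import torch
import torch.nn.functional as F

def combined_loss(seq2seq_loss, pos_scores, neg_scores, margin=0.1, alpha=1.0):
    # seq2seq_loss: cross-entropy from the text-to-text model.
    # pos_scores / neg_scores: scores of picked vs. non-picked definition
    # sentences (e.g., produced by a pointer network).
    target = torch.ones_like(pos_scores)  # picked sentences should rank higher
    rank_loss = F.margin_ranking_loss(pos_scores, neg_scores, target, margin=margin)
    return seq2seq_loss + alpha * rank_loss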

The main system requirements:

  • Python == 3.8.0
  • PyTorch == 1.12.1
  • Transformers == 4.18.0
  • CUDA == 11.3

Environment Setup

Please run the following script to set up the conda environment:

sh setup_env.sh

You can then activate the environment with conda activate pick_rank.

Data Preparation

We use Super-NaturalInstructions for our experiments. Please download the dataset by running:

git clone git@github.com:allenai/natural-instructions.git data

Since Super-NaturalInstructions has no official development set, we randomly select 100 tasks from the "excluded" set (with at most 100 instances per task) as the development set for tuning the hyper-parameters. Please use the following script to process and split the data:

sh setup_data.sh
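
For intuition only, the sketch below mirrors the sampling described above (100 random dev tasks, at most 100 instances each); the paths and JSON fields assume the standard Super-NaturalInstructions layout and are not the exact logic of setup_data.sh.

# Illustrative sketch of the dev-split idea (not the exact setup_data.sh logic).
# Assumes the Super-NaturalInstructions layout: data/tasks/*.json with an
# "Instances" list, and data/splits/default/excluded_tasks.txt listing task names.
import json, random
from pathlib import Path

random.seed(42)
excluded = Path("data/splits/default/excluded_tasks.txt").read_text().split()
dev_tasks = random.sample(excluded, 100)          # 100 tasks for the dev set

for name in dev_tasks:
    task = json.loads(Path(f"data/tasks/{name}.json").read_text())
    task["Instances"] = task["Instances"][:100]   # at most 100 instances per task
    # ... write the trimmed task to the processed data directory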

The data split information can be found in data/splits/add_dev, and the processed data can be found in data/tasks/def_segmentation. You can use the following script to print the data statistics:

python data_statistics.py
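
If you only want a rough count, a few lines like the following do a similar check by hand; the directory and the "Instances" field are assumptions about the processed format.

# Quick, illustrative statistics check (assumes the layout described above).
import json
from pathlib import Path

tasks = list(Path("data/tasks/def_segmentation").glob("*.json"))
n_instances = sum(len(json.loads(p.read_text()).get("Instances", [])) for p in tasks)
print(f"{len(tasks)} tasks, {n_instances} instances")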

Experiments

We use the Hugging Face T5-base model for all experiments and analysis. You can use the following script to train the model:

sh scripts/run.sh
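
For reference, a minimal snippet that loads T5-base with the pinned transformers version and runs one generation looks like this; the prompt below is only a placeholder, not the exact template constructed by scripts/run.sh.

# Minimal T5-base sketch (transformers==4.18.0); the prompt format is an
# illustrative placeholder, not necessarily what scripts/run.sh constructs.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

prompt = "Definition: <picked sentences from the task definition> Input: <instance input>"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))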

The results can be found in the output directory, including the saved model files, the predictions, and all intermediate results.

You can also use the following script to quickly read and print the evaluation scores on the test set:

python read_results.py
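
If you prefer to inspect the numbers by hand, a sketch like the one below scans output for JSON files and prints any ROUGE-style entries; the file names and metric keys are assumptions about the output format.

# Illustrative sketch: scan output/ for JSON files and print ROUGE-like keys.
# File names and metric keys are assumptions about the output format.
import json
from pathlib import Path

for path in Path("output").rglob("*.json"):
    try:
        metrics = json.loads(path.read_text())
    except (json.JSONDecodeError, UnicodeDecodeError):
        continue
    if isinstance(metrics, dict):
        for key, value in metrics.items():
            if "rouge" in key.lower():
                print(f"{path}: {key} = {value}")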

Citation

Please cite the paper if you use any scores or scripts from this repository:

@article{lou2023forget,
  title={Forget demonstrations, focus on learning from textual instructions},
  author={Lou, Renze and Yin, Wenpeng},
  journal={arXiv preprint arXiv:2308.03795},
  year={2023}
}
