
🥧 Pre-Training Intent-Aware Encoders

This repository contains code for Pre-training Intent-Aware Encoders (PIE) and for evaluating them on four intent classification datasets (BANKING77, HWU64, Liu54, and CLINC150).
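
As a quick illustration of what a PIE-style encoder is used for, the sketch below performs zero-shot intent classification by embedding an utterance and the candidate intent names and picking the most similar intent. It uses the sentence-transformers library, a placeholder checkpoint path, and made-up intent names; all of these are assumptions for illustration, not the repository's actual scripts or artifacts.

# Minimal zero-shot sketch: classify an utterance by cosine similarity to intent-name embeddings.
# "path/to/pie-checkpoint" and the intent names below are illustrative placeholders.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("path/to/pie-checkpoint")  # hypothetical trained encoder checkpoint
intent_names = ["transfer money", "check balance", "report lost card"]  # example label names
utterance = "I can't find my debit card anywhere"

intent_emb = encoder.encode(intent_names, convert_to_tensor=True)
utt_emb = encoder.encode(utterance, convert_to_tensor=True)

# Pick the intent whose name embedding is closest to the utterance embedding.
scores = util.cos_sim(utt_emb, intent_emb)[0]
print(intent_names[int(scores.argmax())])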

Environment setup

Option 1: Docker

image_name=pie
code_path=/path/to/intent-aware-encoder
docker build -t $image_name .
nvidia-docker run -it -v ${code_path}:/code $image_name
cd code

Option 2: Conda

conda create -n pie python=3.8
conda activate pie
pip install -r requirements.txt
python -m spacy download en_core_web_md
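
After installing, you can optionally verify that the spaCy model downloaded correctly with the short check below; this is only a sanity check, not part of the repository's scripts.

# Optional sanity check that en_core_web_md is available.
import spacy

nlp = spacy.load("en_core_web_md")
print(nlp("check my account balance")[0].pos_)  # prints a part-of-speech tag such as "VERB"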

Pre-training

See the README in the pretraining directory.

cd pretraining
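
For orientation, the sketch below illustrates the kind of contrastive objective used when aligning utterances with (pseudo) intent names: matched utterance/intent pairs are pulled together while the other intent names in the batch act as negatives. This is an illustrative PyTorch sketch under that assumption, not the repository's training code; the pretraining directory documents the actual procedure and hyperparameters.

# Illustrative InfoNCE-style contrastive loss over (utterance, intent name) pairs.
import torch
import torch.nn.functional as F

def info_nce(utterance_emb, intent_emb, temperature=0.05):
    # Row i of each (batch, dim) tensor is assumed to be a positive utterance/intent pair.
    utterance_emb = F.normalize(utterance_emb, dim=-1)
    intent_emb = F.normalize(intent_emb, dim=-1)
    logits = utterance_emb @ intent_emb.t() / temperature  # (batch, batch) cosine similarities
    targets = torch.arange(logits.size(0), device=logits.device)  # positives on the diagonal
    return F.cross_entropy(logits, targets)

# Toy call with random embeddings, just to show the shapes involved.
loss = info_nce(torch.randn(8, 768), torch.randn(8, 768))
print(loss.item())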

Fine-tuning and Evaluation

See the README in the downstream directory.

cd downstream
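
As a rough picture of few-shot evaluation, the sketch below classifies queries by nearest prototype: the embeddings of the few labeled support examples per intent are averaged into class prototypes, and each query is assigned to the most similar prototype. It is an illustrative sketch with random embeddings, not the evaluation protocol implemented in the downstream directory.

# Illustrative nearest-prototype classification for a few-shot episode.
import torch
import torch.nn.functional as F

def prototype_predict(support_emb, support_labels, query_emb, num_classes):
    # Average the support embeddings of each intent into a class prototype.
    prototypes = torch.stack([
        support_emb[support_labels == c].mean(dim=0) for c in range(num_classes)
    ])
    # Assign each query to the prototype with the highest cosine similarity.
    sims = F.normalize(query_emb, dim=-1) @ F.normalize(prototypes, dim=-1).t()
    return sims.argmax(dim=-1)

# Toy call: 5 intents with 5 support examples each, 10 queries, random embeddings.
preds = prototype_predict(torch.randn(25, 768), torch.arange(25) % 5, torch.randn(10, 768), 5)
print(preds)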

Acknowledgement

Parts of the code are modified from mirror-bert, IDML, and ProtAugment. We thank the authors for open-sourcing their projects.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.

Citation

Please cite the following paper if you use the code or data from this project in your work:

@misc{sung2023pretraining,
      title={Pre-training Intent-Aware Encoders for Zero- and Few-Shot Intent Classification}, 
      author={Mujeen Sung and James Gung and Elman Mansimov and Nikolaos Pappas and Raphael Shu and Salvatore Romeo and Yi Zhang and Vittorio Castelli},
      year={2023},
      eprint={2305.14827},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
