The official implementation for the paper CEM: Commonsense-aware Empathetic Response Generation.
Install the required libraries (Python 3.8.5 | CUDA 10.2)
pip install -r requirements.txt
Download the pretrained GloVe embeddings and save them in /vectors.
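GloVe is distributed as a plain-text file in which each line is a token followed by its vector components, separated by spaces. As a small sketch of that format (the parsing below is illustrative; the training code's actual loader may differ):

```python
# Parse one line of a GloVe-format embeddings file.
# Format assumption: "<word> <v1> <v2> ... <vN>", space-separated.
def parse_glove_line(line):
    parts = line.rstrip().split(" ")
    word, vector = parts[0], [float(x) for x in parts[1:]]
    return word, vector

# Example line (values made up for illustration):
word, vector = parse_glove_line("the 0.1 -0.2 0.3")
```

Any of the standard GloVe releases should work, as long as the file is placed where the code expects it.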
The preprocessed dataset is already provided as /data/ED/dataset_preproc. However, if you want to build the dataset yourself, delete this file, download the COMET checkpoint, and place it in /data/ED/Comet. The preprocessed dataset will then be regenerated when the training script is run.
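The preprocessed file can be inspected before training to confirm it loaded correctly. A minimal sketch, assuming the file is a pickle (its exact internal structure is an assumption; check the keys yourself):

```python
import pickle

# Hypothetical loader for the preprocessed dataset file; the pickle format
# and structure are assumptions, so inspect the returned object's keys.
def load_preprocessed(path):
    with open(path, "rb") as f:
        return pickle.load(f)
```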
python main.py --model [model_name] [--woDiv] [--woEMO] [--woCOG] [--cuda]
where model_name can be one of the following: trs | multi-trs | moel | mime | empdg | cem. The extra flags can be used for ablation studies.
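The command line above can be mirrored with a small argparse sketch; the flag names are taken from this README, but the defaults and types are assumptions about main.py:

```python
import argparse

# Hypothetical mirror of main.py's CLI, built only from the flags listed
# in this README; actual defaults and extra options may differ.
parser = argparse.ArgumentParser()
parser.add_argument("--model", required=True,
                    choices=["trs", "multi-trs", "moel", "mime", "empdg", "cem"])
parser.add_argument("--woDiv", action="store_true")  # ablation: no diversity loss
parser.add_argument("--woEMO", action="store_true")  # ablation: no emotion branch
parser.add_argument("--woCOG", action="store_true")  # ablation: no cognition branch
parser.add_argument("--cuda", action="store_true")   # run on GPU

args = parser.parse_args(["--model", "cem", "--woDiv"])
```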
For reproducibility, download the trained checkpoint, put it in a folder named save, and run the following:
python main.py --model cem --test --model_path save/CEM_19999_41.8034 [--cuda]
Create a folder named results and move the results.txt obtained for each model into it. Rename each file to the name of its model and run the following:
python src/scripts/evaluate.py
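The evaluation script presumably reads the per-model files from the results folder. As a hedged sketch of that layout (the helper below is hypothetical, not the script's actual code):

```python
from pathlib import Path

# Hypothetical helper: gather per-model result files from a "results"
# folder, keyed by model name. The one-file-per-model layout follows the
# README's instructions; the evaluation script's real logic may differ.
def collect_results(results_dir):
    return {p.stem: p.read_text() for p in Path(results_dir).glob("*.txt")}
```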
If you find our work useful for your research, please kindly cite our paper as follows:
@article{CEM2021,
  title={CEM: Commonsense-aware Empathetic Response Generation},
  author={Sahand Sabour and Chujie Zheng and Minlie Huang},
  journal={arXiv preprint arXiv:2109.05739},
  year={2021},
}