NTRD

This repository is the PyTorch implementation of our EMNLP 2021 paper "Learning Neural Templates for Recommender Dialogue System".

In this paper, we introduce NTRD, a novel recommender dialogue system (i.e., conversational recommendation system) framework that decouples the dialogue generation from the item recommendation via a two-stage strategy. Our approach makes the recommender dialogue system more flexible and controllable. Extensive experiments show our approach significantly outperforms the previous state-of-the-art methods.

The code is still being organized; feel free to contact me if you encounter any problems.

Dependencies

pytorch==1.6.0
gensim==3.8.3
torch_geometric==1.6.3
torch-cluster==1.5.8
torch-scatter==2.0.5
torch-sparse==0.6.8
torch-spline-conv==1.2.0
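
One possible way to install these pinned versions (a sketch only; the torch-scatter/sparse/cluster/spline-conv wheels must match your exact PyTorch and CUDA build, so consult the PyTorch Geometric installation notes for the appropriate wheel index):

pip install torch==1.6.0 gensim==3.8.3
pip install torch-scatter==2.0.5 torch-sparse==0.6.8 torch-cluster==1.5.8 torch-spline-conv==1.2.0
pip install torch_geometric==1.6.3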

The required data file word2vec_redial.npy can be produced by the function dataset.prepare_word2vec(), as sketched below.
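
A minimal sketch of producing it, assuming prepare_word2vec() is callable at the top level of dataset.py (adjust the import if the module layout differs):

# Produce word2vec_redial.npy (sketch; the exact call site may differ
# from the actual repository layout).
import dataset

dataset.prepare_word2vec()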

Run

Run the script below to pre-train the recommender module. It should converge after 3 epochs of pre-training and 3 epochs of fine-tuning.

python run.py

Then, run the following script to train the seq2seq dialogue task. The Transformer model is slow to converge, so it needs many epochs to reach convergence; please be patient when training this model.

python run.py --is_finetune True

The model will report the results on the test data automatically after convergence.

To run the novel experiments, you first need to generate data/full_data.jsonl by combining data/train_data.jsonl and data/test_data.jsonl into one file (see the sketch below).
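
One way to build the combined file (a sketch using only the Python standard library; the paths follow the file names above):

# combine_data.py -- concatenate the train and test splits into full_data.jsonl
with open("data/full_data.jsonl", "w", encoding="utf-8") as out_f:
    for split_path in ("data/train_data.jsonl", "data/test_data.jsonl"):
        with open(split_path, "r", encoding="utf-8") as in_f:
            for line in in_f:
                line = line.strip()
                if line:  # skip empty lines
                    out_f.write(line + "\n")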

You also need to uncomment the code in dataset.py at L117 and L317-L322.

Then, run the following script to pre-train the recommender module.

python run_novel.py

The following step is the same as in the conventional setting; run the command below.

python run_novel.py --is_finetune True

Citation

If you find this codebase helpful for your research, please consider citing our paper in your publications.

@inproceedings{liang2021learning,
  title={Learning Neural Templates for Recommender Dialogue System},
  author={Liang, Zujie and 
          Hu, Huang and 
          Xu, Can and 
          Miao, Jian and 
          He, Yingying and 
          Chen, Yining and 
          Geng, Xiubo and 
          Liang, Fan and 
          Jiang, Daxin},
  booktitle={Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year={2021}
}

Acknowledgment

This codebase is built on top of KGSF. Many thanks to the authors for their open-source project.
