
UniTRec

This repository releases the code of the paper UniTRec: A Unified Text-to-Text Transformer and Joint Contrastive Learning Framework for Text-based Recommendation (ACL 2023 Short Paper).

Dataset Preparation

Our code downloads and pre-tokenizes the datasets automatically; also refer to setup.sh.

cd textRec_datasets
python newsrec_tokenize.py
python quoterec_tokenize.py
python engagerec_tokenize.py
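
For a concrete sense of what the pre-tokenization step amounts to, here is a minimal sketch: each history/candidate text is encoded once so training never re-runs the tokenizer. The checkpoint name, texts, and function name below are illustrative assumptions, not values taken from the UniTRec scripts.

from transformers import AutoTokenizer

# Illustrative only: the checkpoint and texts are assumptions,
# not taken from the UniTRec tokenization scripts above.
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")

def pre_tokenize(texts, max_length=128):
    # Encode each text once and cache the tensors for training.
    return tokenizer(texts, truncation=True, max_length=max_length,
                     padding="max_length", return_tensors="pt")

batch = pre_tokenize(["news title A", "news title B"])
print(batch["input_ids"].shape)  # torch.Size([2, 128])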

UniTRec Training

Suppose that two GPUs are available for each training run; adjust CUDA_VISIBLE_DEVICES to match your machine.

CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 newsrec.py
CUDA_VISIBLE_DEVICES=2,3 python -m torch.distributed.launch --nproc_per_node=2 quoterec.py
CUDA_VISIBLE_DEVICES=4,5 python -m torch.distributed.launch --nproc_per_node=2 engagerec.py
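
Note that recent PyTorch releases deprecate torch.distributed.launch in favor of torchrun. Assuming the training scripts read the local rank from the LOCAL_RANK environment variable (torchrun sets this, whereas torch.distributed.launch passes a --local_rank argument), the equivalent invocation would be:

CUDA_VISIBLE_DEVICES=0,1 torchrun --nproc_per_node=2 newsrec.py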

Note

The transformer codebase is adapted from Huggingface Transformers. The UniTRec model is implemented in transformers/models/UniTRec/modeling_unitrec.py.

TODO

The code currently uses two GPUs for training but only a single GPU for inference. Inference could be accelerated by distributing it as well, as in the sketch below.
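
One straightforward way to distribute inference is to shard the test set across ranks with a DistributedSampler and gather the per-rank scores afterwards. This is a hedged sketch only: test_set and model are hypothetical stand-ins for the real UniTRec dataset and model objects, and none of the names come from this repository.

import torch
import torch.distributed as dist
from torch.utils.data import DataLoader, DistributedSampler

# Hypothetical sketch: `test_set` and `model` stand in for the real
# UniTRec dataset and model, which this repository defines elsewhere.
dist.init_process_group("nccl")
rank = dist.get_rank()
model = model.to(rank).eval()

# DistributedSampler shards the test indices across ranks (padding the
# last shard so every rank processes the same number of samples).
sampler = DistributedSampler(test_set, shuffle=False)
loader = DataLoader(test_set, batch_size=64, sampler=sampler)

scores = []
with torch.no_grad():
    for input_ids in loader:                       # assumes tensor batches
        scores.append(model(input_ids.to(rank)))   # each rank scores its shard

# Collect every rank's scores on all processes for the final ranking metrics.
local = torch.cat(scores)
gathered = [torch.empty_like(local) for _ in range(dist.get_world_size())]
dist.all_gather(gathered, local)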

Citation

@inproceedings{mao-etal-2023-unitrec,
    title = "UniTRec: A Unified Text-to-Text Transformer and Joint Contrastive Learning Framework for Text-based Recommendation",
    author = "Mao, Zhiming  and
              Wang, Huimin  and
              Du, Yiming  and
              Wong, Kam-Fai",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.acl-short.100",
    pages = "1160--1170"
}
