FedDCSR: Federated Cross-domain Sequential Recommendation via Disentangled Representation Learning

Hongyu Zhang, Dongyi Zheng, Xu Yang, Jiyuan Feng, Qing Liao*

1 Introduction

This is the source code and baselines of our SDM'24 paper FedDCSR: Federated Cross-domain Sequential Recommendation via Disentangled Representation Learning. In this paper, we propose FedDCSR, a novel federated cross-domain sequential recommendation framework via disentangled representation learning.

2 Dependencies

Run the following command to install dependencies:

pip install -r requirements.txt

Note that my Python version is 3.8.13. In particular, the PyTorch version must be <= 1.7.1; otherwise the PyTorch autograd engine will report an error.
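
If you are unsure which versions you have installed, you can quickly check them before training:

python --version
python -c "import torch; print(torch.__version__)"

If the printed PyTorch version is above 1.7.1, downgrading (for example with pip install torch==1.7.1, picking the build that matches your CUDA setup) should avoid the autograd error.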

3 Dataset

Following many cross-domain recommendation methods, we utilize the publicly available datasets from Amazon (an e-commerce platform) to construct the federated CSR scenarios. We select ten domains to generate three cross-domain scenarios: Food-Kitchen-Clothing-Beauty (FKCB), Movie-Book-Game (MBG), and Sports-Garden-Home (SGH).

The preprocessed CSR datasets can be downloaded from Google Drive; after downloading, place them in the ./data directory of this project.
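
For example (the archive name below is only a placeholder for whatever you obtain from Google Drive; adjust the unzip target if the archive already contains a data/ folder):

mkdir -p ./data
unzip <downloaded-archive>.zip -d ./data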

4 Code Structure

FedDCSR
β”œβ”€β”€ LICENSE                                     LICENSE file
β”œβ”€β”€ README.md                                   README file 
β”œβ”€β”€ checkpoint                                  Model checkpoints saving directory
β”‚   └── ...
β”œβ”€β”€ data                                        Data directory
β”‚   └── ...
β”œβ”€β”€ log                                         Log directory
β”‚   └── ...
β”œβ”€β”€ models                                      Local model packages
β”‚   β”œβ”€β”€ __init__.py                             Package initialization file
β”‚   β”œβ”€β”€ cl4srec                                 CL4SRec package
β”‚   β”‚   β”œβ”€β”€ __init__.py                         Package initialization
β”‚   β”‚   β”œβ”€β”€ cl4srec_model.py                    Model architecture
β”‚   β”‚   β”œβ”€β”€ config.py                           Model configuration file
β”‚   β”‚   └── modules.py                          Backbone modules (such as self-attention)
β”‚   └── ...
β”œβ”€β”€ pic                                         Picture directory
β”‚   └── FedDCSR-Framework.png                   Model framework diagram
β”œβ”€β”€ utils                                       Tools such as data reading, IO functions, training strategies, etc.
β”‚   β”œβ”€β”€ __init__.py                             Package initialization file
β”‚   β”œβ”€β”€ data_utils.py                           Data reading
β”‚   β”œβ”€β”€ io_utils.py                             IO functions
β”‚   └── train_utils.py                          Training strategies
β”œβ”€β”€ client.py                                   Client architecture   
β”œβ”€β”€ dataloader.py                               Customized dataloader
β”œβ”€β”€ dataset.py                                  Customized dataset          
β”œβ”€β”€ fl.py                                       The overall process of federated learning
β”œβ”€β”€ local_graph.py                              Local graph data structure
β”œβ”€β”€ losses.py                                   Loss functions
β”œβ”€β”€ main.py                                     Main function, including the complete data pipeline
β”œβ”€β”€ requirements.txt                            Dependencies installation
β”œβ”€β”€ server.py                                   Server-side model parameters and user representations aggregation
β”œβ”€β”€ trainer.py                                  Training and test methods of FedDCSR and other baselines
└── .gitignore                                  .gitignore file

5 Train & Eval

5.1 Our method

To train FedDCSR (ours), you can run the following command:

python -u main.py \
        --epochs 40 \
        --local_epoch 3 \
        --eval_interval 1 \
        --frac 1.0 \
        --batch_size 256 \
        --log_dir log \
        --method FedDCSR \
        --anneal_cap 1.0 \
        --lr 0.001 \
        --seed 42 \
        Food Kitchen Clothing Beauty

There are a few points to note:

  • The positional arguments Food Kitchen Clothing Beauty indicate training FedDCSR in the FKCB scenario. If you want to choose another scenario, change them to Movie Book Game (MBG) or Sports Garden Home (SGH).

  • The argument --anneal_cap controls KL annealing for variational methods (including ours). For FKCB, 1.0 works best; for MBG and SGH, 0.01 works best (see the example command after this list).

  • If you restart training in a scenario you have trained on before, you can add the flag --load_prep to load the dataset preprocessed in the previous run and avoid repeated data preprocessing.
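
For example, a training run in the MBG scenario with its recommended annealing cap (assuming the positional domain names follow the MBG naming above) would look like this, with all other arguments kept as in the command above:

python -u main.py \
        --epochs 40 \
        --local_epoch 3 \
        --eval_interval 1 \
        --frac 1.0 \
        --batch_size 256 \
        --log_dir log \
        --method FedDCSR \
        --anneal_cap 0.01 \
        --lr 0.001 \
        --seed 42 \
        Movie Book Game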

To test FedDCSR, you can run the following command:

python -u main.py \
        --log_dir log \
        --method FedDCSR \
        --do_eval \
        --seed 42 \
        Food Kitchen Clothing Beauty

5.2 Baselines

To train the other baselines (FedSASRec, FedVSAN, FedContrastVAE, FedCL4SRec, FedDuoRec), run the following command, setting --method to the baseline you want (FedContrastVAE is used as an example):

python -u main.py \
        --epochs 40 \
        --local_epoch 3 \
        --eval_interval 1 \
        --frac 1.0 \
        --batch_size 256 \
        --log_dir log \
        --method FedContrastVAE \
        --anneal_cap 1.0 \
        --lr 0.001 \
        --seed 42 \
        Food Kitchen Clothing Beauty

For the local version without federated aggregation, you can run the following command:

python -u main.py \
        --epochs 40 \
        --local_epoch 3 \
        --eval_interval 1 \
        --frac 1.0 \
        --batch_size 256 \
        --log_dir log \
        --method LocalContrastVAE \
        --anneal_cap 1.0 \
        --lr 0.001 \
        --seed 42 \
        Food Kitchen Clothing Beauty

6 Citation

If you find this work useful for your research, please cite FedDCSR as follows:

@misc{zhang2023feddcsr,
      title={FedDCSR: Federated Cross-domain Sequential Recommendation via Disentangled Representation Learning}, 
      author={Hongyu Zhang and Dongyi Zheng and Xu Yang and Jiyuan Feng and Qing Liao},
      year={2023},
      eprint={2309.08420},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
