SASRec: Self-Attentive Sequential Recommendation

This is our TensorFlow implementation for the paper:

Wang-Cheng Kang, Julian McAuley. Self-Attentive Sequential Recommendation. In Proceedings of the IEEE International Conference on Data Mining (ICDM '18).

Please cite our paper if you use the code or datasets.

The code has been tested on a Linux desktop (with a GTX 1080 Ti GPU) running TensorFlow 1.2.

Datasets

The preprocessed datasets are included in the repo (e.g. data/Video.txt). Each line contains a user id and an item id (both starting from 1) and represents one interaction; each user's interactions are sorted by timestamp.
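Given that format, loading the data into one chronological item sequence per user is straightforward. The sketch below is illustrative (the function name is ours, not from util.py), assuming whitespace-separated "user item" lines as described above:

```python
from collections import defaultdict

def load_sequences(lines):
    """Group 'user_id item_id' lines (1-indexed, already sorted by
    timestamp) into a chronological item sequence per user."""
    seqs = defaultdict(list)
    for line in lines:
        u, i = map(int, line.split())
        seqs[u].append(i)
    return seqs

# usage: seqs = load_sequences(open('data/Video.txt'))
```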

The data pre-processing script is also included. For example, you can download the Amazon review data from here, and run the script to produce the txt-format data.
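Conceptually, the pre-processing boils down to remapping raw user/item identifiers to consecutive 1-based ids and ordering each user's interactions by timestamp. This is a hedged sketch of that idea, not the repo's actual script; the function name and record layout (user, item, timestamp tuples) are our assumptions:

```python
from collections import defaultdict

def to_txt_format(records):
    """Sketch of the pre-processing idea (not the repo's script):
    map raw user/item identifiers to 1-based integer ids and emit
    one 'user_id item_id' line per interaction, with each user's
    interactions sorted by timestamp."""
    user_ids, item_ids = {}, {}
    by_user = defaultdict(list)
    for user, item, ts in records:
        uid = user_ids.setdefault(user, len(user_ids) + 1)
        iid = item_ids.setdefault(item, len(item_ids) + 1)
        by_user[uid].append((ts, iid))
    lines = []
    for uid in sorted(by_user):
        for ts, iid in sorted(by_user[uid]):
            lines.append(f"{uid} {iid}")
    return lines
```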

Model Training

To train our model on Video (with default hyper-parameters):

```
python main.py --dataset=Video --train_dir=default
```

or on ml-1m:

```
python main.py --dataset=ml-1m --train_dir=default --maxlen=200 --dropout_rate=0.2
```

Misc

The implementation of self-attention is modified from this.

The convergence curve on ml-1m, compared with CNN/RNN-based approaches, is shown in curve.png in the repo.