Guanyu-Lin/DFAR

Dual-interest Factorization-heads Attention for Sequential Recommendation

This is our TensorFlow implementation for the paper:

Dual-interest Factorization-heads Attention for Sequential Recommendation

The code is tested under a Linux desktop with TensorFlow 1.12.3 and Python 3.6.8.

Data Pre-processing

The pre-processing script is reco_utils/dataset/sequential_reviews.py, which can be executed via:

python examples/00_quick_start/sequential.py --is_preprocessing True
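The exact pre-processing steps live in the script above. As a rough illustration only (the field names and data layout here are assumptions, not the repository's actual schema), sequential recommenders typically group each user's reviews chronologically into an interaction sequence:

```python
from collections import defaultdict

# Hypothetical raw Amazon review tuples: (user_id, item_id, timestamp).
reviews = [
    ("u1", "i3", 100), ("u1", "i1", 50), ("u2", "i2", 10), ("u1", "i2", 75),
]

def build_sequences(reviews):
    # Group reviews per user, then sort each group by timestamp so the
    # model sees items in the order the user interacted with them.
    by_user = defaultdict(list)
    for user, item, ts in reviews:
        by_user[user].append((ts, item))
    return {u: [item for _, item in sorted(evts)] for u, evts in by_user.items()}

print(build_sequences(reviews))  # {'u1': ['i1', 'i2', 'i3'], 'u2': ['i2']}
```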

Model Training

To train our model on the Amazon dataset (with default hyper-parameters):

python examples/00_quick_start/sequential.py

Misc

The implementation of self-attention is modified from Microsoft's TensorFlow framework.
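The self-attention underlying the factorization-heads mechanism follows the standard multi-head scaled dot-product form. A minimal NumPy sketch of that baseline (random matrices stand in for learned projections; this is not the repository's actual code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # (seq_len, d_head)

def multi_head_attention(X, num_heads, rng):
    # Split the model dimension into num_heads subspaces, attend in each
    # head independently, then concatenate the per-head outputs.
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        # Random projections stand in for the learned weight matrices.
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) for _ in range(3))
        heads.append(scaled_dot_product_attention(X @ Wq, X @ Wk, X @ Wv))
    return np.concatenate(heads, axis=-1)                # (seq_len, d_model)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))   # 5 interactions, embedding dim 8
out = multi_head_attention(X, num_heads=2, rng=rng)
print(out.shape)                  # (5, 8)
```

The paper's contribution factorizes and specializes these heads for dual (positive/negative) interests; the sketch above shows only the vanilla mechanism being modified.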
