|Installation | Quick Start | Documentation | Contributing | Getting help | Citation|


Beta-RecSys is an open-source project for Building, Evaluating and Tuning Automated Recommender Systems. Beta-RecSys aims to provide a practical data toolkit for building end-to-end recommendation systems in a standardized way. It provides means for dataset preparation and splitting using common strategies, a generalized model engine for implementing recommender models using PyTorch with many models available out-of-the-box, as well as a unified training, validation, tuning and testing pipeline. Furthermore, Beta-RecSys is designed to be both modular and extensible, enabling new models to be quickly added to the framework. It is deployable in a wide range of environments via pre-built Docker containers and supports distributed parameter tuning using Ray.



Installation

If you use conda, you can install it with:

conda install beta-rec


If you use pip, you can install it with:

pip install beta-rec


We also provide a Docker image so you can run this project on any platform. You can use the image as follows:

  1. Pull image from Docker Hub

    docker pull betarecsys/beta-recsys:latest
  2. Start a docker container with this image (make sure port 8888 is available on your local machine, or change the port in the command)

    docker run -ti --name beta-recsys -p 8888:8888 -d betarecsys/beta-recsys
  3. Open Jupyter in a browser with this URL (the container publishes port 8888, e.g. http://localhost:8888):

  4. Enter root as the password for the notebook.

Quick Start

Downloading and Splitting Datasets

from beta_rec.datasets.movielens import Movielens_100k
from beta_rec.data import BaseData
dataset = Movielens_100k()
split_dataset = dataset.load_leave_one_out(n_test=1)
data =  BaseData(split_dataset)
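The `load_leave_one_out` strategy holds out each user's latest interaction for testing and keeps the rest for training. A minimal, self-contained sketch of the idea (illustrative only, not Beta-RecSys's internal implementation):

```python
# Illustrative leave-one-out split: for each user, the most recent
# interaction goes to the test set and all earlier ones stay in training.
# This sketches the idea behind load_leave_one_out; it is not the
# library's internal code.
def leave_one_out(interactions):
    """interactions: list of (user, item, timestamp) tuples."""
    latest = {}
    for user, item, ts in interactions:
        if user not in latest or ts > latest[user][2]:
            latest[user] = (user, item, ts)
    test = set(latest.values())
    train = [row for row in interactions if row not in test]
    return train, sorted(test)

interactions = [
    ("u1", "i1", 1), ("u1", "i2", 2),
    ("u2", "i3", 5), ("u2", "i4", 3),
]
train, test = leave_one_out(interactions)
# → train keeps ("u1","i1") and ("u2","i4"); test holds each user's latest item
```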

Training model with MatrixFactorization

config = {"config_file": "./configs/mf_default.json"}
from beta_rec.recommenders import MatrixFactorization
model = MatrixFactorization(config)
model.train(data)
result = model.test(data.test[0])

where a default config JSON file ./configs/mf_default.json will be loaded for training the model.
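A common pattern for this kind of config handling is that keys supplied in the Python dict override the defaults read from the JSON file. The sketch below illustrates that pattern with a hypothetical `load_config` helper; it mirrors the behaviour described above but is not Beta-RecSys's exact implementation:

```python
import json
import os
import tempfile

# Hypothetical illustration of default-config loading: values passed in
# the config dict override those read from the default JSON file. This is
# a sketch of the common pattern, not Beta-RecSys's exact code.
def load_config(user_config, default_path):
    with open(default_path) as f:
        config = json.load(f)
    config.update(user_config)  # user-supplied keys win over defaults
    return config

# A temporary file stands in for a default config such as mf_default.json;
# the keys used here ("lr", "emb_dim") are illustrative.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"lr": 0.001, "emb_dim": 64}, f)
    path = f.name

cfg = load_config({"lr": 0.05}, path)  # override the default learning rate
os.remove(path)
# → cfg == {"lr": 0.05, "emb_dim": 64}
```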

Tuning Model Hyper-parameters

config = {"config_file": "./configs/mf_default.json", "tune": True}
tune_result = model.train(data)
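Conceptually, tuning searches over candidate hyper-parameter values and keeps the best-scoring configuration. Beta-RecSys delegates the real (distributed) search to Ray, so the following is only a self-contained grid-search sketch with a toy objective standing in for validation performance:

```python
import itertools

# Illustrative grid search: evaluate every combination of hyper-parameter
# values and keep the best one. Beta-RecSys uses Ray for the real,
# distributed version; this only shows the principle.
def grid_search(param_grid, evaluate):
    best_score, best_params = float("-inf"), None
    keys = sorted(param_grid)
    for values in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

# Toy objective peaking at lr=0.01, emb_dim=64 (a stand-in for a real
# validation metric such as NDCG):
def toy_eval(p):
    return -abs(p["lr"] - 0.01) - abs(p["emb_dim"] - 64) / 100

best, score = grid_search({"lr": [0.001, 0.01, 0.1], "emb_dim": [32, 64]},
                          toy_eval)
# → best == {"emb_dim": 64, "lr": 0.01}
```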

Experiment with multiple models

from beta_rec.recommenders import MatrixFactorization
from beta_rec.experiment.experiment import Experiment

# Initialise recommenders with their default configuration file

config = {"config_file": "./configs/mf_default.json"}

mf_1 = MatrixFactorization(config)
mf_2 = MatrixFactorization(config)

# Run experiments of the recommenders on the selected dataset

result = Experiment(
  datasets=[data],
  models=[mf_1, mf_2],
).run()

where each model will tune its hyper-parameters according to the specified tuning scheme (e.g. the default scheme for MF).
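An experiment ultimately reduces to scoring every model's ranked recommendations with the same metric on the same test set. As an illustration, here is Recall@k computed from scratch for two hypothetical models (this is a stand-in for the shared evaluation step, not the library's own evaluator):

```python
# Illustrative Recall@k: the fraction of each user's held-out items that
# appear in the model's top-k recommendations, averaged over users. This
# sketches the shared evaluation an experiment performs; it is not the
# library's evaluator.
def recall_at_k(recommended, relevant, k=2):
    users = list(relevant)
    total = sum(len(set(recommended[u][:k]) & relevant[u]) / len(relevant[u])
                for u in users)
    return total / len(users)

# Hypothetical held-out items and two models' ranked recommendation lists:
relevant = {"u1": {"i2"}, "u2": {"i5", "i6"}}
model_a = {"u1": ["i2", "i9"], "u2": ["i5", "i6", "i1"]}
model_b = {"u1": ["i9", "i8"], "u2": ["i6", "i1", "i5"]}

score_a = recall_at_k(model_a, relevant)  # hits for both users → 1.0
score_b = recall_at_k(model_b, relevant)  # (0.0 + 0.5) / 2 → 0.25
```

Running both models through the same metric like this is what makes the results of an `Experiment` directly comparable.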


Models

The following is a list of recommender models currently available in the repository, or to be implemented soon.

General Models

Model Paper Colab
MF Neural Collaborative Filtering vs. Matrix Factorization Revisited, arXiv’ 2020 Example In Colab
GMF Generalized Matrix Factorization, in Neural Collaborative Filtering, WWW 2017
MLP Multi-Layer Perceptron, in Neural Collaborative Filtering, WWW 2017
NCF Neural Collaborative Filtering, WWW 2017 Example In Colab
CMN Collaborative memory network for recommendation systems, SIGIR 2018
NGCF Neural graph collaborative filtering, SIGIR 2019 Example In Colab
LightGCN LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation, SIGIR 2020 Example In Colab
LCF Graph Convolutional Network for Recommendation with Low-pass Collaborative Filters
VAECF Variational autoencoders for collaborative filtering, WWW 2018
UserKNN User-based K-Nearest Neighbour Recommender in Factorization Meets the Neighborhood: a Multifaceted Collaborative Filtering, KDD 2008
ItemKNN Item-based K-Nearest Neighbour Recommender in Item-Based Collaborative Filtering Recommendation, WWW 2001
MixGCF MixGCF: An Improved Training Method for Graph Neural Network-based Recommender Systems, KDD 2021
UltraGCN UltraGCN: Ultra Simplification of Graph Convolutional Networks for Recommendation, CIKM 2021
SGL Self-supervised Graph Learning for Recommendation, SIGIR 2021
SimGCL Are Graph Augmentations Necessary? Simple Graph Contrastive Learning for Recommendation, SIGIR 2022
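At the core of many of these general models is plain matrix factorization: user and item embeddings trained so that their dot products approximate the observed interactions. A minimal NumPy sketch of MF trained with SGD (illustrative only; the repository's models are implemented in PyTorch):

```python
import numpy as np

# Minimal matrix factorization by SGD: learn user/item embeddings whose
# dot products fit the observed ratings. Illustrative only; unrelated to
# the repository's PyTorch model engine.
def factorize(ratings, n_users, n_items, dim=4, lr=0.05, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    P = rng.normal(scale=0.1, size=(n_users, dim))  # user embeddings
    Q = rng.normal(scale=0.1, size=(n_items, dim))  # item embeddings
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]          # prediction error for this pair
            P[u] += lr * err * Q[i]        # gradient step on user factors
            Q[i] += lr * err * P[u]        # gradient step on item factors
    return P, Q

# Toy (user, item, rating) data with a clear low-rank structure:
ratings = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 1.0), (1, 1, 5.0)]
P, Q = factorize(ratings, n_users=2, n_items=2)
mse = np.mean([(r - P[u] @ Q[i]) ** 2 for u, i, r in ratings])
```

Models such as GMF, NCF and the GCN family can be read as progressively richer replacements for the dot-product interaction above.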

Sequential Models

Model Paper Colab
NARM Neural Attentive Session-based Recommendation, CIKM 2017
Caser Personalized Top-N Sequential Recommendation via Convolutional Sequence Embedding, WSDM 2018
GRU4Rec Session-based recommendations with recurrent neural networks, ICLR 2016
SASRec Self-attentive sequential recommendation, ICDM 2018 Example In Colab
MARank Multi-Order Attentive Ranking Model for Sequential Recommendation, AAAI 2019
NextItnet A Simple Convolutional Generative Network for Next Item Recommendation, WSDM 2019
BERT4Rec BERT4Rec: Sequential recommendation with bidirectional encoder representations from transformer, CIKM 2019
TiSASRec Time Interval Aware Self-Attention for Sequential Recommendation. WSDM'20
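Sequential models such as GRU4Rec and SASRec are trained on next-item prediction: each prefix of a session is the input and the item that follows is the target. A sketch of how a session expands into training pairs (illustrative preprocessing, not the repository's data loader):

```python
# Illustrative next-item training-pair construction for sequential models:
# every prefix of a session becomes an input sequence and the item that
# follows becomes the prediction target. This sketches the common
# preprocessing; it is not the repository's data loader.
def make_next_item_pairs(session, max_len=3):
    pairs = []
    for t in range(1, len(session)):
        prefix = session[max(0, t - max_len):t]  # truncate to a fixed window
        pairs.append((prefix, session[t]))
    return pairs

pairs = make_next_item_pairs(["i1", "i2", "i3", "i4"], max_len=2)
# → [(["i1"], "i2"), (["i1", "i2"], "i3"), (["i2", "i3"], "i4")]
```

The `max_len` window mirrors the fixed maximum sequence length that attention-based models like SASRec and TiSASRec assume.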

Recommendation Models with Auxiliary Information


Model Paper Colab
Triple2vec Representing and recommending shopping baskets with complementarity, compatibility and loyalty, CIKM 2018 Example In Colab
VBCAR Variational Bayesian Context-aware Representation for Grocery Recommendation, arXiv’ 2019 Example In Colab
TVBR Temporal Variational Bayesian Representation Learning for Grocery Recommendation Example In Colab

Knowledge Graph

If you want your model to be implemented by our maintenance team (or by yourself), please submit an issue following our community instruction.

Recent change logs: see the version releases.


Contributing

This project welcomes contributions and suggestions. Please make sure to read the Contributing Guide before creating a pull request.

Community meeting

  • Meeting time: Saturday (1:30 – 2:30pm UTC +0) / (9:30 – 10:30pm UTC +8) Add Event
  • Meeting minutes: notes
  • Meeting recordings: can be found in each meeting note.

Discussion channels

  • Slack: Slack Status
  • Mailing list: TBC


Citation

If you use Beta-RecSys in your research, we would appreciate citations to the following paper:

@inproceedings{meng2020beta,
  title={BETA-Rec: Build, Evaluate and Tune Automated Recommender Systems},
  author={Meng, Zaiqiao and McCreadie, Richard and Macdonald, Craig and Ounis, Iadh and Liu, Siwei and Wu, Yaxiong and Wang, Xi and Liang, Shangsong and Liang, Yucheng and Zeng, Guangtao and others},
  booktitle={Fourteenth ACM Conference on Recommender Systems},
  year={2020}
}