tffm2

This is a TensorFlow 2.0 implementation of an arbitrary-order (>=2) Factorization Machine, based on the paper Factorization Machines with libFM.

It supports:

  • different (gradient-based) optimization methods
  • classification/regression via different loss functions (logistic and MSE implemented); see the regression sketch after this list
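For example, regression with the MSE loss can be set up in the same way as classification. This is a minimal sketch and assumes tffm2 mirrors the original tffm API, i.e. that a TFFMRegressor class with the same constructor arguments is exported alongside TFFMClassifier and accepts dense NumPy inputs:

import numpy as np
import tensorflow as tf
from tffm2 import TFFMRegressor  # assumed: regressor counterpart of TFFMClassifier

# Second-order FM trained with the MSE loss and plain SGD instead of Adam.
model = TFFMRegressor(
    order=2,
    rank=8,
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
    n_epochs=50,
    init_std=0.001
)

# Toy dense data: 100 samples, 20 features, real-valued targets.
X_tr = np.random.rand(100, 20).astype(np.float32)
y_tr = np.random.rand(100).astype(np.float32)
model.fit(X_tr, y_tr, show_progress=True)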

The inference time is linear with respect to the number of features.
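That linear scaling follows from Rendle's reformulation of the pairwise interaction term: the sum over all feature pairs can be rewritten in terms of per-factor sums. The NumPy snippet below only illustrates that identity; it is not the library's internal code:

import numpy as np

rng = np.random.default_rng(0)
d, k = 200, 10              # number of features, factorization rank
x = rng.random(d)           # one input sample
V = rng.random((d, k))      # factor matrix (one k-dimensional embedding per feature)

# Naive O(d^2 * k) sum over all feature pairs.
naive = sum(V[i] @ V[j] * x[i] * x[j]
            for i in range(d) for j in range(i + 1, d))

# Linear O(d * k) reformulation:
# 0.5 * sum_f ((sum_i V[i, f] * x[i])^2 - sum_i V[i, f]^2 * x[i]^2)
fast = 0.5 * np.sum((V.T @ x) ** 2 - (V ** 2).T @ x ** 2)

print(np.allclose(naive, fast))  # True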

Tested on Python 3.6.

Dependencies

Installation

The stable version can be installed via pip install tffm2.

Usage

The interface is similar to scikit-learn models. To train a 6-order FM model with rank=10 for 100 epochs using the Adam optimizer with learning_rate=0.00001, use the following sample:

import tensorflow as tf
from tffm2 import TFFMClassifier

model = TFFMClassifier(
    order=6,                 # order of feature interactions to model
    rank=10,                 # rank of the factorization
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.00001),
    n_epochs=100,
    init_std=0.001           # std of the initial weight distribution
)
model.fit(X_tr, y_tr, show_progress=True)
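
Assuming tffm2 keeps the scikit-learn-style predict() method from the original tffm, inference on a held-out feature matrix X_te would look like this:

# X_te: held-out feature matrix with the same number of columns as X_tr.
predictions = model.predict(X_te)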

See example.ipynb and gpu_benchmark.ipynb for more details.

Reading tffm/core.py is highly recommended for understanding the implementation details.

Testing

Run python test.py from the terminal.

Reference

This code is ported from https://github.com/geffy/tffm.
