TensorLy is a Python library that aims to make tensor learning simple and accessible. It lets you easily perform tensor decomposition, tensor learning, and tensor algebra. Its backend system seamlessly dispatches computation to NumPy, PyTorch, JAX, TensorFlow, or CuPy, and runs methods at scale on CPU or GPU.

Installing TensorLy

The only prerequisite is Python 3. The easiest way to get it is via the Anaconda distribution.

With pip (recommended):

pip install -U tensorly

With conda:

conda install -c tensorly tensorly
Development (from git)
# clone the repository
git clone
cd tensorly
# Install in editable mode with `-e` or, equivalently, `--editable`
pip install -e .

Note: TensorLy depends on NumPy by default. If you want to use other backends, you will need to install these packages separately.
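For example, the other backends can be installed from PyPI under their standard package names (only install the ones you need):

```shell
# Optional backends, installed separately from TensorLy itself.
# Package names below are the standard PyPI distributions.
pip install torch         # PyTorch backend
pip install tensorflow    # TensorFlow backend
pip install jax           # JAX backend
```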

For detailed instructions, please see the documentation.


Creating tensors

Create a small third-order tensor of size 3 x 4 x 2 from a NumPy array, and perform simple operations on it:

import tensorly as tl
import numpy as np

tensor = tl.tensor(np.arange(24).reshape((3, 4, 2)), dtype=tl.float64)
unfolded = tl.unfold(tensor, mode=0)           # matricize along mode 0: shape (3, 8)
tl.fold(unfolded, mode=0, shape=tensor.shape)  # back to the original (3, 4, 2)
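For intuition, mode-n unfolding rearranges a tensor into a matrix whose rows index mode n, and folding inverts it. A pure-NumPy sketch of the two operations, following TensorLy's unfolding convention (illustrative only; TensorLy's own versions are backend-agnostic):

```python
import numpy as np

def unfold(tensor, mode):
    """Matricize: move `mode` to the front, then flatten the remaining modes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(unfolded, mode, shape):
    """Inverse of unfold: reshape, then move the front axis back to `mode`."""
    moved_shape = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(unfolded.reshape(moved_shape), 0, mode)

tensor = np.arange(24).reshape((3, 4, 2))
unfolded = unfold(tensor, mode=0)   # shape (3, 8)
assert np.array_equal(fold(unfolded, 0, tensor.shape), tensor)
```

Round-tripping through unfold and fold recovers the original tensor for every mode.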

You can also create random tensors:

from tensorly import random

# A random tensor
tensor = random.random_tensor((3, 4, 2))
# A random rank-2 CP tensor in factorized form
cp_tensor = random.random_cp(shape=(3, 4, 2), rank=2)

You can also create random tensors in TT (tensor-train) format, Tucker format, etc.; see the random tensors documentation.
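In factorized CP form, a rank-R tensor is stored as a weight per component plus one factor matrix per mode; the full tensor is the weighted sum of rank-one outer products. A NumPy sketch for a third-order tensor (illustrative of the format, not TensorLy's internal code):

```python
import numpy as np

rng = np.random.default_rng(0)
shape, rank = (3, 4, 2), 2

# CP format: one weight per rank-one component, one factor matrix per mode
weights = np.ones(rank)
factors = [rng.standard_normal((dim, rank)) for dim in shape]

# Reconstruct the full tensor: sum_r w_r * (a_r outer b_r outer c_r)
full = np.einsum('r,ir,jr,kr->ijk', weights, *factors)
assert full.shape == shape
```

Storing only the factors needs 2*(3 + 4 + 2) numbers here instead of 24, which is the point of the factorized form.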

Setting the backend

You can change the backend to perform computation with a different framework. By default, the backend is NumPy, but you can also perform the computation using PyTorch, TensorFlow, JAX, or CuPy, provided they are installed. For instance, after setting the backend to PyTorch, all computation is done by PyTorch, and tensors can be created on GPU:

tl.set_backend('pytorch') # Or 'numpy', 'tensorflow', 'cupy' or 'jax'
tensor = tl.tensor(np.arange(24).reshape((3, 4, 2)), device='cuda:0')
type(tensor) # torch.Tensor

Tensor decomposition

Applying tensor decomposition is easy:

from tensorly.decomposition import tucker
# Apply Tucker decomposition
tucker_tensor = tucker(tensor, rank=[2, 2, 2])
# Reconstruct the full tensor from the decomposed form
tl.tucker_to_tensor(tucker_tensor)

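Reconstruction from Tucker form multiplies the core tensor by each factor matrix along the corresponding mode. For a third-order tensor, this can be sketched in pure NumPy as a single einsum (illustrative only; in TensorLy this is tl.tucker_to_tensor):

```python
import numpy as np

rng = np.random.default_rng(0)
core = rng.standard_normal((2, 2, 2))                       # Tucker core G
factors = [rng.standard_normal((d, 2)) for d in (3, 4, 2)]  # one factor per mode

# G x_1 A x_2 B x_3 C: contract each mode of the core with its factor matrix
full = np.einsum('abc,ia,jb,kc->ijk', core, *factors)
assert full.shape == (3, 4, 2)
```

The same result is obtained by applying the three mode-n products one at a time, which is how the operation is usually defined.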
We have many more decompositions available, be sure to check them out!

Next steps

This is just a very quick introduction to some of the basic features of TensorLy. For more information on getting started, check out the user guide; for a detailed reference of the functions and their documentation, refer to the API reference.

If you see a bug, open an issue, or better yet, a pull-request!

Contributing code

All contributions are welcome! If you have a cool tensor method you want to add, or if you spot a bug or even a typo or mistake in the documentation, please report it, or better yet, open a Pull Request on GitHub.

Before you submit your changes, make sure your code adheres to our style guide. The easiest way to do this is with black:

pip install black
black .

Running the tests

Testing and documentation are an essential part of this package: all functions come with unit tests and documentation.

The tests are run using the pytest package. First, install pytest:

pip install pytest

Then, to run the tests, simply run in a terminal:

pytest -v tensorly

Alternatively, you can specify for which backend you wish to run the tests:

TENSORLY_BACKEND='numpy' pytest -v tensorly


If you use TensorLy in an academic paper, please cite [1]:

@article{tensorly,
  author  = {Jean Kossaifi and Yannis Panagakis and Anima Anandkumar and Maja Pantic},
  title   = {TensorLy: Tensor Learning in Python},
  journal = {Journal of Machine Learning Research},
  year    = {2019},
  volume  = {20},
  number  = {26},
  pages   = {1-6},
  url     = {}
}

[1] Jean Kossaifi, Yannis Panagakis, Anima Anandkumar and Maja Pantic. TensorLy: Tensor Learning in Python. Journal of Machine Learning Research (JMLR), 20(26):1-6, 2019.