
Picard: Preconditioned ICA for Real Data


This repository hosts Python/Octave/Matlab code for the Preconditioned ICA for Real Data (Picard) and Picard-O algorithms.

See the documentation.


Picard is an algorithm for maximum-likelihood independent component analysis. It achieves state-of-the-art convergence speed, solving the same problems as the widely used FastICA, Infomax, and extended-Infomax algorithms, but faster.


The parameter ortho chooses whether to work under an orthogonal constraint (i.e., to enforce decorrelation of the output) or not. Picard also comes with an extended version, like extended-Infomax, which makes separation of both sub- and super-Gaussian signals possible; it is enabled with the parameter extended.

  • ortho=False, extended=False: same solution as Infomax
  • ortho=False, extended=True: same solution as extended-Infomax
  • ortho=True, extended=True: same solution as FastICA
  • ortho=True, extended=False: same solution as Infomax under the orthogonal constraint


We recommend the Anaconda Python distribution.


Picard can be installed from conda-forge. Add conda-forge to your conda channels, then run:

$ conda install python-picard


Otherwise, to install picard, you first need to install its dependencies:

$ pip install numpy matplotlib numexpr scipy

Then install Picard with pip:

$ pip install python-picard

or to get the latest version of the code:

$ pip install git+

If you do not have admin privileges on the computer, use the --user flag with pip. To upgrade, use the --upgrade flag provided by pip.


To check that the installation worked, run:

$ python -c 'import picard'

and it should not give any error message.


The Matlab/Octave version of Picard and Picard-O is available here.


To get started, you can build a synthetic mixed signals matrix:

>>> import numpy as np
>>> N, T = 3, 1000
>>> S = np.random.laplace(size=(N, T))
>>> A = np.random.randn(N, N)
>>> X = np.dot(A, S)

And then use Picard to separate the signals:

>>> from picard import picard
>>> K, W, Y = picard(X)

Picard outputs the whitening matrix K, the estimated unmixing matrix W, and the estimated sources Y. They satisfy Y = W K X.

NEW: scikit-learn compatible API

Introducing picard.Picard, which mimics sklearn.decomposition.FastICA behavior:

>>> from sklearn.datasets import load_digits
>>> from picard import Picard
>>> X, _ = load_digits(return_X_y=True)
>>> transformer = Picard(n_components=7)
>>> X_transformed = transformer.fit_transform(X)
>>> X_transformed.shape
(1797, 7)


These are the dependencies to use Picard:

  • numpy (>=1.8)
  • matplotlib (>=1.3)
  • numexpr (>= 2.0)
  • scipy (>=0.19)

These are the dependencies to run the EEG example:

  • mne (>=0.14)


If you use this code in your project, please cite:

Pierre Ablin, Jean-François Cardoso, Alexandre Gramfort
Faster independent component analysis by preconditioning with Hessian approximations
IEEE Transactions on Signal Processing, 2018

Pierre Ablin, Jean-François Cardoso, Alexandre Gramfort
Faster ICA under orthogonal constraint
ICASSP, 2018


New in 0.8: for the density exp, the default parameter is now alpha = 0.1 instead of alpha = 1.