

Self Organizing Maps

MiniSom is a minimalistic, NumPy-based implementation of Self-Organizing Maps (SOM). A SOM is a type of artificial neural network able to convert complex, nonlinear statistical relationships between high-dimensional data items into simple geometric relationships on a low-dimensional display.


Installation

Just use pip:

pip install minisom

or download MiniSom to a directory of your choice and use the setup script:

git clone https://github.com/JustGlowing/minisom.git
cd minisom
python setup.py install

How to use it

In order to use MiniSom you need your data organized as a NumPy matrix where each row corresponds to an observation, or as a list of lists like the following:

data = [[ 0.80,  0.55,  0.22,  0.03],
        [ 0.82,  0.50,  0.23,  0.03],
        [ 0.80,  0.54,  0.22,  0.03],
        [ 0.80,  0.53,  0.26,  0.03],
        [ 0.79,  0.56,  0.22,  0.03],
        [ 0.75,  0.60,  0.25,  0.03],
        [ 0.77,  0.59,  0.22,  0.03]]      

Then you can run MiniSom just as follows:

from minisom import MiniSom    
som = MiniSom(6, 6, 4, sigma=0.3, learning_rate=0.5) # initialization of 6x6 SOM
som.train_random(data, 100) # trains the SOM with 100 iterations

MiniSom implements two types of training: random training (implemented by the method train_random), where the model is trained by picking random samples from your data, and batch training (implemented by the method train_batch), where the samples are used in the order they are stored.

The weights of the network are randomly initialized by default. Two additional methods are provided to initialize the weights in a data-driven fashion: random_weights_init and pca_weights_init.

Using the trained SOM

After training you will be able to:

  • Compute the coordinate assigned to an observation x on the map with the method winner(x).
  • Compute the average distance map of the weights on the map with the method distance_map().
  • Compute the number of times each neuron has been selected as the winner for the observations of a new data set with the method activation_response(data).
  • Compute the quantization error with the method quantization_error(data).

Vector quantization

The data can be quantized by assigning the code book vector (the weight vector of the winning neuron) to each sample in data. This kind of vector quantization is implemented by the method quantization, which can be called as follows:

qnt = som.quantization(data)

In this example, qnt[i] is the quantized version of data[i].

Export a SOM and load it again

A model can be saved using pickle as follows:

import pickle
som = MiniSom(7, 7, 4)

# ...train the som here

# saving the som in the file som.p
with open('som.p', 'wb') as outfile:
    pickle.dump(som, outfile)

and can be loaded as follows:

with open('som.p', 'rb') as infile:
    som = pickle.load(infile)

Note that if a lambda function is used to define the decay factor, MiniSom will no longer be picklable.


You can find some examples of how to use MiniSom in the examples directory of the repository.

Here are some of the charts you'll see how to generate in the examples:

  • Iris map and class assignment
  • Handwritten digits mapping and recognition
  • Images mapping
  • Color quantization
  • Outliers detection

Other tutorials

Who uses MiniSom?

Compatibility notes

MiniSom has been tested under Python 3.6.2.


MiniSom by Giuseppe Vettigli is licensed under the Creative Commons Attribution 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by/3.0/.

