# JustGlowing/minisom


# MiniSom

## Self Organizing Maps

MiniSom is a minimalistic, NumPy-based implementation of Self Organizing Maps (SOM). A SOM is a type of artificial neural network able to convert complex, nonlinear statistical relationships between high-dimensional data items into simple geometric relationships on a low-dimensional display.

## Installation

Just use pip:

```
pip install minisom
```

or clone the repository and install it manually:

```
git clone https://github.com/JustGlowing/minisom.git
python setup.py install
```

## How to use it

In order to use MiniSom you need your data organized as a NumPy matrix where each row corresponds to an observation, or as a list of lists like the following:

```
data = [[0.80, 0.55, 0.22, 0.03],
        [0.82, 0.50, 0.23, 0.03],
        [0.80, 0.54, 0.22, 0.03],
        [0.80, 0.53, 0.26, 0.03],
        [0.79, 0.56, 0.22, 0.03],
        [0.75, 0.60, 0.25, 0.03],
        [0.77, 0.59, 0.22, 0.03]]
```

Then you can run MiniSom just as follows:

```
from minisom import MiniSom
som = MiniSom(6, 6, 4, sigma=0.3, learning_rate=0.5)  # initialization of a 6x6 SOM
som.train_random(data, 100)  # trains the SOM with 100 iterations
```

MiniSom implements two types of training: random training (implemented by the method `train_random`), where the model is trained on samples picked at random from your data, and batch training (implemented by the method `train_batch`), where the samples are picked in the order they are stored.

The weights of the network are randomly initialized by default. Two additional methods are provided to initialize the weights in a data-driven fashion: `random_weights_init` and `pca_weights_init`.

### Using the trained SOM

After training you will be able to:

* Compute the coordinates assigned to an observation `x` on the map with the method `winner(x)`.
* Compute the average distance map of the weights on the map with the method `distance_map()`.
* Compute the number of times each neuron has been considered the winner for the observations of a new data set with the method `activation_response(data)`.
* Compute the quantization error with the method `quantization_error(data)`.

#### Vector quantization

The data can be quantized by assigning to each sample the codebook vector (the weight vector of its winning neuron). This kind of vector quantization is implemented by the method `quantization`, which can be called as follows:

`qnt = som.quantization(data)`

In this example we have that `qnt[i]` is the quantized version of `data[i]`.

#### Export a SOM and load it again

A model can be saved using pickle as follows:

```
import pickle
som = MiniSom(7, 7, 4)

# ...train the som here

# saving the som in the file som.p
with open('som.p', 'wb') as outfile:
    pickle.dump(som, outfile)
```

and can be loaded as follows:

```
with open('som.p', 'rb') as infile:
    som = pickle.load(infile)
```
Note that if a lambda function is used to define the decay factor, MiniSom will no longer be picklable.

## Examples

You can find some examples of how to use MiniSom here: https://github.com/JustGlowing/minisom/tree/master/examples

The examples also show how to generate several charts visualizing the trained SOM.

## Compatibility notes

MiniSom has been tested under Python 3.6.2.