The LISSOM family of self-organizing computational models aims to replicate in detail the development of the human visual cortex.
PyLissom is a PyTorch extension implementing the LISSOM networks. It is split into two parts: the core nn and optim packages, which implement the LISSOM network itself, and the supporting datasets, models, and utils packages.
Some of the datasets, models, and utils in PyLissom were inspired by Topographica, an earlier implementation of the LISSOM networks whose design was oriented to the neuroscience community. PyLissom, in contrast, was designed for a hybrid use case serving both the machine learning and the neuroscience communities.
The library and API documentation are at https://pylissom.readthedocs.io/; check it out for a high-level overview. A UML class diagram is included there for reference. For hands-on examples there are Jupyter notebooks in the notebooks/ folder. If GitHub does not render them, we leave these links at your disposal:
Orientation Maps and pylissom tools
The main features provided by pylissom are:

- LISSOM's activation (see the sketch after this list)
- LISSOM's Hebbian learning mechanism and others (sketched below)
- Configuration and model-building tools
- Common Gaussian stimuli for LISSOM experiments (sketched below)
- Plotting helpers
- Training pipeline objects (a toy cycle is sketched below)
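As an illustration of the activation feature, here is a minimal PyTorch sketch of the piecewise-linear sigmoid and settling step used in LISSOM-style models. The function names, thresholds, and gains are illustrative choices, not pylissom's API:

```python
import torch

def piecewise_sigmoid(x, theta_low=0.1, theta_high=0.65):
    """LISSOM's piecewise-linear approximation of a sigmoid:
    0 below theta_low, 1 above theta_high, linear in between."""
    return torch.clamp((x - theta_low) / (theta_high - theta_low), 0.0, 1.0)

def settle_step(afferent, activity, W_exc, W_inh, gamma_e=0.9, gamma_i=0.9):
    """One settling step: afferent drive plus lateral excitation minus
    lateral inhibition, squashed by the piecewise sigmoid."""
    lateral = gamma_e * activity @ W_exc - gamma_i * activity @ W_inh
    return piecewise_sigmoid(afferent + lateral)
```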
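Similarly, a minimal sketch of LISSOM's normalized Hebbian rule; again, the names and the learning rate are assumptions rather than the library's interface:

```python
import torch

def hebbian_update(W, pre, post, alpha=0.01):
    """Normalized Hebbian rule of LISSOM-style models: strengthen weights
    in proportion to correlated pre-/post-synaptic activity, then divisively
    normalize each neuron's incoming weights so they keep a constant sum."""
    W = W + alpha * torch.outer(pre, post)   # Hebbian correlation term
    return W / W.sum(dim=0, keepdim=True)    # divisive (sum) normalization
```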
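The Gaussian stimuli are oriented two-dimensional Gaussian bumps, the classic inputs for orientation-map experiments. A sketch with assumed defaults (grid size, widths, and angle are arbitrary):

```python
import math
import torch

def gaussian_stimulus(size=24, sigma_x=2.0, sigma_y=8.0, theta=0.0):
    """Oriented 2-D Gaussian bump centered on the grid; theta rotates
    the elongated axis."""
    coords = torch.arange(size, dtype=torch.float32) - size / 2
    xs, ys = coords.view(1, -1), coords.view(-1, 1)   # broadcast to a grid
    xr = xs * math.cos(theta) + ys * math.sin(theta)  # rotated coordinates
    yr = -xs * math.sin(theta) + ys * math.cos(theta)
    return torch.exp(-(xr**2 / (2 * sigma_x**2) + yr**2 / (2 * sigma_y**2)))
```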
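Finally, a toy present/settle/learn cycle of the kind a training pipeline object encapsulates. It reuses the helpers sketched above rather than pylissom's own classes, and all sizes and iteration counts are arbitrary:

```python
import math
import torch

# Reuses piecewise_sigmoid, settle_step, hebbian_update, gaussian_stimulus
# from the sketches above.
n_in, n_out = 24 * 24, 10 * 10
W_aff = torch.rand(n_in, n_out)
W_aff = W_aff / W_aff.sum(dim=0, keepdim=True)    # normalized afferent weights
W_exc = torch.rand(n_out, n_out)
W_exc = W_exc / W_exc.sum(dim=0, keepdim=True)    # lateral excitatory weights
W_inh = torch.rand(n_out, n_out)
W_inh = W_inh / W_inh.sum(dim=0, keepdim=True)    # lateral inhibitory weights

for _ in range(100):                              # stimulus presentations
    stim = gaussian_stimulus(theta=torch.rand(1).item() * math.pi).flatten()
    afferent = stim @ W_aff                       # afferent drive
    act = piecewise_sigmoid(afferent)             # initial response
    for _ in range(10):                           # settle lateral dynamics
        act = settle_step(afferent, act, W_exc, W_inh)
    W_aff = hebbian_update(W_aff, stim, act)      # learn on settled activity
```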
You should first install PyTorch with conda, as explained at https://pytorch.org/. Then you can install PyLissom by running:
```
pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple pylissom
```
The package is hosted on the PyPI test instance: https://test.pypi.org/project/pylissom/
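As a quick sanity check after installing, the package should import cleanly (a minimal snippet; it only confirms the import succeeds):

```python
# Post-install check: the import succeeding is the test.
import pylissom
print(pylissom.__name__)
```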
The tests live in the tests/ folder and can be run with pytest. The repository also has Travis CI enabled, so every commit and pull request runs the tests in a virtualenv, showing up as green checkmarks or red crosses on the PR page. These are all the integration links of the repo:
- Travis - Continuous Integration: repo_page
- Codecov - Code coverage: repo_page
- Scrutinizer - Code health: repo_page
- CodeClimate - Maintainability: repo_page
- ReadTheDocs - Documentation: repo_page
For any questions, please contact the repo collaborators.
The project is licensed under the GPLv3 license.