This is a fork of the CodeReclaimers/neat-python repository.
Its main focus is non-dominated sorting for multiobjective fitness, i.e. optimizing more than one fitness value at once. This is done through an implementation of NSGA-II as a reproduction method. See the readme in neat/nsga2/ for more details.
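To give a rough idea of what non-dominated sorting means, here is a generic, self-contained sketch of Pareto dominance and front ranking (the central idea behind NSGA-II). This is an illustration only, not the code in neat/nsga2/; function names and the toy fitness vectors are made up for the example, and maximization of every objective is assumed:

```python
def dominates(a, b):
    """True if fitness vector a Pareto-dominates b: a is at least as
    good in every objective and strictly better in at least one
    (maximization assumed)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def non_dominated_sort(population):
    """Split a list of fitness vectors into Pareto fronts.
    Front 0 holds the non-dominated solutions; front 1 holds the
    solutions dominated only by front 0; and so on."""
    remaining = list(range(len(population)))
    fronts = []
    while remaining:
        # Indices not dominated by any other remaining individual.
        front = [i for i in remaining
                 if not any(dominates(population[j], population[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Toy example: the first three vectors trade off the two objectives
# and are mutually non-dominated; the last is dominated by all of them.
pop = [(1.0, 5.0), (2.0, 4.0), (3.0, 1.0), (0.5, 0.5)]
print(non_dominated_sort(pop))  # [[0, 1, 2], [3]]
```

NSGA-II uses this ranking (plus a crowding-distance tie-breaker) to select parents, so that no single objective is collapsed into a weighted sum.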
The repository also includes a hoverboard game/simulation that serves as a test problem for the NSGA-II feature, along with examples for training it both with and without NSGA-II. See the readme in examples/nsga2 for more details.
I've tried to keep changes to the core library to a minimum, so merging into the main fork should be easy. All of these changes are backwards-compatible.
NEAT (NeuroEvolution of Augmenting Topologies) is a method developed by Kenneth O. Stanley for evolving arbitrary neural networks. This project is a pure-Python implementation of NEAT with no dependencies beyond the standard library. It was forked from the excellent project by @MattKallada, and is in the process of being updated to provide more features and a (hopefully) simpler and documented API.
For further information regarding general concepts and theory, please see Selected Publications on Stanley's website.
neat-python
is licensed under the 3-clause BSD license.
If you want to try neat-python, please check out the repository, start playing with the examples (examples/xor is a good place to start), and then try creating your own experiment.
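Every experiment is driven by a configuration file passed to neat.Config. As a hedged sketch of what such a file looks like, here is a fragment in the style of the xor example's config; the section and parameter names follow neat-python's documented format, but the values shown are illustrative, not prescriptive:

```ini
[NEAT]
# Stop when the best genome's fitness reaches the threshold.
fitness_criterion     = max
fitness_threshold     = 3.9
pop_size              = 150
reset_on_extinction   = False
```

The full file also needs genome, species, stagnation, and reproduction sections; the examples directory contains complete, working configs to copy from.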
The documentation is available on Read The Docs.
Here is a Bibtex entry you can use to cite this project in a publication. The listed authors are the maintainers of all iterations of the project up to this point.
@misc{neat-python,
  title = {neat-python},
  author = {Alan McIntyre and Matt Kallada and Cesar G. Miguel and Carolina Feher da Silva},
  howpublished = {\url{https://github.com/CodeReclaimers/neat-python}}
}