Neat is a modern, from-scratch Java implementation of NeuroEvolution of Augmenting Topologies (NEAT) as described in the original paper by Kenneth O. Stanley and Risto Miikkulainen, “Evolving Neural Networks through Augmenting Topologies” (stanley.ec02.pdf).
The goal of this project is to provide a clean, well-tested, and faithful reference implementation of NEAT for the JVM, suitable both for learning and for use in research and production experiments.
Faithful to the original algorithm:
- Historical markings and innovation numbers
- Structural mutations (add-node, add-connection)
- Speciation via compatibility distance and explicit fitness sharing
- Incremental growth from minimal topologies (no hidden nodes initially)
Modern Java:
- Targeting Java 25 (LTS)
- Functional-style APIs where useful, immutable value objects for genomes and genes where appropriate
- Optional modules for parallel evaluation (using `ForkJoinPool` / virtual threads in later iterations)
Configurable NEAT engine:
- Population size, mutation rates, and speciation parameters configurable at runtime
- Pluggable fitness evaluators and stopping criteria
- Deterministic runs via seedable RNG
Examples and benchmarks (aligned with the paper):
- XOR verification task
- Single-pole balancing
- Double-pole balancing with and without velocities
Tooling (planned):
- JUnit tests and property-based tests for core components
- Static analysis and code-quality tooling (e.g. Checkstyle, Forbidden APIs, SpotBugs/FindBugs-style checks)
- Mutation testing with PIT/pitest for verifying test robustness
- Simple visualization hooks (species size over time, fitness curves, topology inspection)
NEAT evolves both network weights and topologies. Its performance advantage, as argued in the original paper (stanley.ec02.pdf), comes from three main ideas:
- Historical markings: Each structural mutation is tagged with an innovation number, allowing genomes with different topologies to be aligned sensibly during crossover.
- Speciation: Genomes are clustered into species based on a compatibility distance that combines counts of excess and disjoint genes with the average weight difference of matching genes, with explicit fitness sharing to protect innovation.
- Complexification from minimal structure: Evolution starts from networks with no hidden nodes and gradually complexifies only when beneficial, keeping the search space low-dimensional throughout training.
Neat aims to implement these mechanisms as directly and transparently as possible.
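As a small illustration of the first idea, historical marking can be as simple as a global counter that stamps each new structural gene. The names below (`ConnectionGene`, `InnovationCounter`) are placeholder sketches, not the project's final API:

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch (not the final API): each structural mutation is
// stamped with a globally unique innovation number at creation time.
record ConnectionGene(int in, int out, double weight, boolean enabled, long innovation) {}

final class InnovationCounter {
    private final AtomicLong counter = new AtomicLong();

    long nextInnovation() {
        return counter.getAndIncrement();
    }
}

public class HistoricalMarkingDemo {
    public static void main(String[] args) {
        InnovationCounter innovations = new InnovationCounter();
        // Two independent add-connection mutations receive distinct numbers, so
        // crossover can later align genes by innovation number, not by position.
        ConnectionGene a = new ConnectionGene(0, 2, 0.5, true, innovations.nextInnovation());
        ConnectionGene b = new ConnectionGene(1, 2, -0.3, true, innovations.nextInnovation());
        System.out.println(a.innovation() + " " + b.innovation()); // prints "0 1"
    }
}
```

During crossover, genes sharing an innovation number are treated as matching; the rest are disjoint or excess, exactly as the paper describes.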
As the project evolves, the repository will be organized roughly as follows:
`core`: Core NEAT primitives and engine
- Genes, genomes, and networks (nodes, connections)
- Mutation, crossover, and speciation
- Evolution loop and population management
`examples`: Example problems and demos
- XOR
- Pole balancing tasks
- Simple control / toy environments
`visualization` (optional / later):
- Basic visualizations of species, fitness, and evolved topologies
The initial implementation will likely start as a single module; the above layout is a roadmap, not a guaranteed final structure.
Requirements:
- Java: 25 (current LTS, recommended)
- Build tool: Gradle with Kotlin DSL (`build.gradle.kts`)
Clone the repository:

```shell
git clone https://github.com/your-org/Neat.git
cd Neat
```

Build and test (Gradle with Kotlin DSL):

```shell
./gradlew build
./gradlew test
```

The intended user experience is:
- Define a phenotype / evaluation function:
  Implement an interface such as `FitnessEvaluator` that, given a decoded network, can evaluate fitness for your problem.
- Configure NEAT:
  Create a `NeatConfig` specifying:
  - Population size
  - Mutation probabilities (add-node, add-connection, weight perturbation)
  - Speciation parameters (`c1`, `c2`, `c3`, `δt`)
  - Stopping conditions (max generations, target fitness, etc.)
- Run evolution:
  Use a `NeatEngine` to evolve a population and obtain the best genome / network.
- Decode and use the network:
  Convert the best genome into a feedforward or recurrent network and apply it in your environment (control task, game agent, etc.).
A future README update will include concrete Java code examples once the core API has stabilized.
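In the meantime, the evaluation step can be sketched with plain JDK types. Everything below — the functional interface and the `double[] -> double` network stand-in — is an illustrative assumption, not the project's API; only the XOR fitness formula (4 minus the summed error, squared) follows the classic NEAT setup:

```java
import java.util.function.ToDoubleFunction;

// Hypothetical sketch of the intended flow. The real FitnessEvaluator /
// NeatEngine API is not yet stable; here the evaluator is a plain functional
// interface and a "network" is just a double[] -> double function.
public class XorFitnessSketch {

    @FunctionalInterface
    interface FitnessEvaluator {
        double evaluate(ToDoubleFunction<double[]> network);
    }

    static final double[][] INPUTS = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
    static final double[] TARGETS = { 0, 1, 1, 0 };

    // Classic NEAT XOR fitness: 4 minus the summed absolute error, squared
    // (higher is better; a perfect network scores 16).
    static final FitnessEvaluator XOR = network -> {
        double error = 0;
        for (int i = 0; i < INPUTS.length; i++) {
            error += Math.abs(TARGETS[i] - network.applyAsDouble(INPUTS[i]));
        }
        double base = 4.0 - error;
        return base * base;
    };

    public static void main(String[] args) {
        // A hand-written stand-in for a decoded genome; exact on all four cases.
        ToDoubleFunction<double[]> perfect = in -> (in[0] != in[1]) ? 1.0 : 0.0;
        System.out.println(XOR.evaluate(perfect)); // prints 16.0
    }
}
```

The engine would call such an evaluator once per genome per generation, with the decoded network substituted for the hand-written function above.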
To stay close to the reference implementation, Neat will provide sensible defaults inspired by the paper (stanley.ec02.pdf), while still being fully configurable:
- Population size: 150 individuals
- Compatibility distance \( \delta \):

  \[ \delta = \frac{c_1 E}{N} + \frac{c_2 D}{N} + c_3 \cdot \overline{W} \]

  - \( E \): number of excess genes
  - \( D \): number of disjoint genes
  - \( \overline{W} \): average weight difference of matching genes
  - \( N \): number of genes in the larger genome (or 1 for small genomes)
  - Default coefficients: \( c_1 = 1.0 \), \( c_2 = 1.0 \), \( c_3 = 0.4 \)
- Speciation threshold: \( \delta_t = 3.0 \)
- Mutation rates (subject to tuning):
  - High probability of weight mutation per genome (e.g. ~0.8)
  - Add-connection mutation probability higher than add-node
  - Probability that a gene disabled in either parent remains disabled in the offspring (e.g. 0.75)
All of these will be exposed as configuration options; the defaults will be documented in JavaDocs and in this README.
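As a concrete illustration, the distance formula above is a direct transcription into code. The class and method names here are illustrative only; the small-genome cutoff follows the paper's convention of setting \( N = 1 \) when both genomes are small:

```java
// Hypothetical sketch of the compatibility distance with the paper's defaults.
public class CompatibilityDistance {
    static final double C1 = 1.0, C2 = 1.0, C3 = 0.4;

    /**
     * @param excess        number of excess genes (E)
     * @param disjoint      number of disjoint genes (D)
     * @param avgWeightDiff average weight difference of matching genes (W-bar)
     * @param largerSize    gene count of the larger genome
     */
    static double delta(int excess, int disjoint, double avgWeightDiff, int largerSize) {
        // The paper normalizes by N but treats N as 1 when both genomes
        // are small (fewer than ~20 genes).
        double n = largerSize < 20 ? 1.0 : largerSize;
        return (C1 * excess) / n + (C2 * disjoint) / n + C3 * avgWeightDiff;
    }

    public static void main(String[] args) {
        // Small genomes (N treated as 1): delta = 1*2 + 1*3 + 0.4*0.5 = 5.2,
        // well outside the default threshold of 3.0, so these two genomes
        // would land in different species.
        System.out.println(delta(2, 3, 0.5, 10)); // prints 5.2
    }
}
```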
v0.1 – Core NEAT engine
- Genomes, genes, innovation tracking
- Structural mutation, crossover, and speciation
- Minimal feedforward network decoding and evaluation loop
- XOR demo and tests
v0.2 – Classic benchmarks
- Single-pole and double-pole balancing environments
- Reproduction of the main benchmarks from the paper (evaluations, generalization checks)
v0.3 – Ergonomic APIs and tooling
- Higher-level configuration builder
- Simple charting / logging hooks
- Example integration with a game / control library
Later
- Visualization of species and topologies
- Persistence of runs, checkpointing, and replay
- Parallel evaluation and distributed runs
Contributions are welcome, especially around:
- Verifying behavior against the original experiments
- Additional examples and environments
- Performance tuning and parallel evaluation
- Documentation, diagrams, and visualizations
Please open an issue or pull request with a clear description of your change and how it relates to the NEAT algorithm as described in the paper.
This project is directly inspired by and aims to faithfully follow:
- Kenneth O. Stanley and Risto Miikkulainen,
Evolving Neural Networks through Augmenting Topologies,
Evolutionary Computation, 10(2):99–127, 2002.
  (stanley.ec02.pdf)