neatJax: Fast NeuroEvolution of Augmenting Topologies 🪸

JAX implementation of the NEAT (NeuroEvolution of Augmenting Topologies) algorithm.

🚀 TO DO

Forward Pass:

  • Add connection weights (a forward-pass sketch follows this list)
  • Add individual activation functions
  • Add conditional activation of output nodes, return output values
  • Test forward when a single neuron is linked to multiple receivers
  • Test forward pass on larger architectures

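As a point of reference for the forward-pass items above, here is a minimal sketch of a message-passing forward pass over a padded connection list, written in JAX. The array layout (`senders`, `receivers`, `weights` padded with `-1`), the function name, and the single `tanh` activation are assumptions made for illustration, not the repository's actual implementation; conditional activation of output nodes is left out.

```python
import jax
import jax.numpy as jnp
from functools import partial

# Assumed genome layout for this sketch: connections are stored as padded
# arrays `senders` and `receivers` (int32, -1 marks an unused slot) and
# `weights` (float32), all of shape (max_connections,).

@partial(jax.jit, static_argnames=("n_nodes", "n_inputs", "depth"))
def forward(inputs, senders, receivers, weights, n_nodes, n_inputs, depth):
    """Toy message-passing forward pass, repeated `depth` times."""
    values = jnp.zeros(n_nodes).at[:n_inputs].set(inputs)
    valid = senders >= 0
    safe_senders = jnp.where(valid, senders, 0)
    safe_receivers = jnp.where(valid, receivers, 0)

    def step(values, _):
        # Each valid connection sends `sender_value * weight` to its receiver.
        messages = jnp.where(valid, values[safe_senders] * weights, 0.0)
        summed = jnp.zeros(n_nodes).at[safe_receivers].add(messages)
        # A single tanh stands in for individually configurable activations.
        new_values = jnp.tanh(summed)
        # Input nodes are clamped to the observation at every step.
        return new_values.at[:n_inputs].set(inputs), None

    values, _ = jax.lax.scan(step, values, None, length=depth)
    return values  # slice out the output node values as needed
```

Repeating the propagation step `depth` times lets signals reach the output nodes of a feed-forward genome without an explicit topological sort, at the cost of some redundant computation.
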
Mutations:

  • Determine mutation frequency and common practices
  • Implement the following mutations:
    • Weight shift
    • Weight reset
    • Add node
    • Add connection
    • Mutate activation
  • Wrap all mutations in a single function (a weight-mutation sketch follows this list)

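For the weight-shift and weight-reset items, common NEAT practice is to perturb most connection weights by a small random amount and occasionally re-initialise one entirely. Below is a minimal sketch assuming a padded weight array; `mutate_weights`, its probabilities, and its scales are hypothetical and not part of this repository.

```python
import jax
import jax.numpy as jnp

def mutate_weights(key, weights, shift_prob=0.8, reset_prob=0.1, shift_scale=0.1):
    """Shift most weights by Gaussian noise, reset a few, leave the rest."""
    k_choice, k_shift, k_reset = jax.random.split(key, 3)
    u = jax.random.uniform(k_choice, weights.shape)

    shifted = weights + shift_scale * jax.random.normal(k_shift, weights.shape)
    reset = jax.random.uniform(k_reset, weights.shape, minval=-1.0, maxval=1.0)

    # Per-connection choice: shift with probability `shift_prob`,
    # reset with probability `reset_prob`, otherwise keep the weight.
    weights = jnp.where(u < shift_prob, shifted, weights)
    weights = jnp.where(u > 1.0 - reset_prob, reset, weights)
    return weights
```

Because the per-connection decision is made with `jnp.where`, the whole mutation stays vectorised and `jit`-friendly.
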
Misc:

  • Add Hydra config for constant attributes (a sketch follows this list)
  • Separate max_nodes and max_connections
  • Add bias
  • Set the minimum sender index to 1 instead of 0

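For the Hydra item, one common pattern is a structured config registered through Hydra's `ConfigStore`. The sketch below uses placeholder field names and defaults, not the repository's actual constants.

```python
from dataclasses import dataclass

import hydra
from hydra.core.config_store import ConfigStore

@dataclass
class NeatConfig:
    # Placeholder constants; the real attribute names may differ.
    max_nodes: int = 30
    max_connections: int = 60
    population_size: int = 150
    weight_shift_scale: float = 0.1

cs = ConfigStore.instance()
cs.store(name="neat_config", node=NeatConfig)

@hydra.main(version_base=None, config_name="neat_config")
def main(cfg: NeatConfig) -> None:
    print(cfg.max_nodes, cfg.max_connections)

if __name__ == "__main__":
    main()
```
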
Crossing:

  • Add novelty fields to Network dataclass
  • Implement crossing for two simple networks
  • Create a Species dataclass
  • Define a distance metric between networks to cluster them into species (a candidate metric is sketched after this list)

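A natural candidate for the distance metric is the compatibility distance from the original NEAT paper, which combines the number of excess genes $E$, the number of disjoint genes $D$, and the average weight difference $\bar{W}$ of matching genes, normalised by the size $N$ of the larger genome:

$$\delta = \frac{c_1 E}{N} + \frac{c_2 D}{N} + c_3 \, \bar{W}$$

The coefficients $c_1$, $c_2$, $c_3$ weight the relative importance of each term; genomes whose $\delta$ falls below a compatibility threshold are grouped into the same species.
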
📝 References

  • Stanley, K. O., & Miikkulainen, R. (2002). Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation, 10(2), 99–127.
