
Weight Agnostic Neural Networks

Code to reproduce and extend the experiments in 'Weight Agnostic Neural Networks' by Adam Gaier and David Ha.

This repository is split into 4 parts:

  • WANN: Self-contained code for replicating the experiments in the paper. If you just want to look at the details of the implementation this is the code for you.

  • prettyNEAT: A general implementation of the NEAT algorithm -- used as an inspiration and departure point for WANNs. Performs simultaneous weight and topology optimization. If you want to do your own unrelated neuroevolution experiments with numpy and OpenAI Gym this is the code for you.

  • prettyNeatWann: WANNs implemented as a fork of prettyNEAT -- inherits methods and structures from prettyNEAT. If you want to heavily modify or do extensive experiments with WANNs this is the code for you.

  • WANNTool: If you want to fine tune the weights of an existing WANN and test their performance over many trials, this is the code for you.

Using the VAE Racer environment

The pretrained VAE used in the VAE Racer experiments is about 20MB, so rather than include a copy in every folder we keep a single copy in the base directory. To use it, copy the vae folder into the directory of the experiment you want to run, e.g.

cp -r vae WANN/
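If you plan to run VAE Racer from several of the experiment folders, the same copy step can be repeated for each of them. A minimal sketch (directory names taken from this repository's layout; run from the repo root):

```shell
# Copy the shared pretrained VAE into each experiment directory.
# The guards skip directories that are missing, so the loop is safe
# to re-run from a partial checkout.
for dir in WANN prettyNeatWann WANNTool; do
  [ -d vae ] && [ -d "$dir" ] && cp -r vae "$dir"/
done
```

Each experiment directory then contains its own vae folder, at the cost of some duplicated disk space.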


For attribution in academic contexts, please cite this work as:

@article{wann2019,
  author = {Adam Gaier and David Ha},
  title  = {Weight Agnostic Neural Networks},
  eprint = {arXiv:1906.04358},
  url    = {},
  note   = "\url{}",
  year   = {2019}
}


This is not an official Google product.
