# MacGraph with Population-based Training
Apologies that the training data isn't available yet; I've not found a quick solution to this. Once the system works on more questions, I'll publish a stable "all question" dataset with 1M items. For now, you can easily build your own data; ask David for help.
| Objective | Status | Notes |
|-----------|--------|-------|
| Basic MAC cell structure | Complete | Implemented as per the paper, now diverging to achieve the objectives below |
| Recall station (node) properties | Complete | 99.9% accuracy after 10k training steps |
| Answer if stations adjacent | Complete | 99% accuracy after 20k training steps |
| Stations N apart | Semi-complete | 98% accuracy up to ~9 apart after 25k training steps |
| Station existence | Complete | 99.9% accuracy after 30k training steps |
| Station with property adjacent | Complete | 98.8% accuracy after 30k training steps `164ddc2` |
| Station adjacent to two other stations | In progress | 98% accuracy after 20k training steps |
For more in-depth information about what works/doesn't work, check out the experiment log.
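For readers new to the architecture: each MAC cell combines a control unit (attending over the question), a read unit (attending over the knowledge base), and a write unit (updating memory). As a rough, untrained NumPy sketch of that recurrence — all weight names and shapes here are illustrative, not the repository's actual code:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MACCellSketch:
    """Illustrative single-step MAC cell (untrained random weights)."""

    def __init__(self, d, seed=0):
        rng = np.random.default_rng(seed)
        self.Wc = rng.normal(scale=0.1, size=(2 * d, d))  # control projection
        self.Wm = rng.normal(scale=0.1, size=(2 * d, d))  # write projection

    def __call__(self, control, memory, question_words, knowledge):
        # Control unit: attend over question words, conditioned on prior control.
        cq = np.concatenate([control, question_words.mean(axis=0)]) @ self.Wc
        attn_q = softmax(question_words @ cq)
        control = attn_q @ question_words
        # Read unit: attend over knowledge-base rows, gated by the current memory.
        interaction = knowledge * memory
        attn_kb = softmax(interaction @ control)
        read = attn_kb @ knowledge
        # Write unit: fold the retrieved information into the memory state.
        memory = np.concatenate([memory, read]) @ self.Wm
        return control, memory
```

Stacking several such cells (sharing or not sharing weights) gives the multi-step reasoning chain the table's objectives are trained on.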
## Running the code
See RUNNING.md for how to run the network and how to use a cluster to optimize its hyper-parameters.
The short summary of how to train locally:
```shell
$ pipenv install
$ pipenv shell
(mac-graph-sjOzWQ6Y) $ python -m macgraph.train
```
Thanks to Drew Hudson and Christopher Manning for publishing their paper, Compositional Attention Networks for Machine Reasoning, upon which this work is based. Thanks also to DeepMind for publishing their Differentiable Neural Computer results in Nature, with a demonstration of that architecture solving graph problems; it is reassuring that this endeavor is not ill-founded.
## Since you're here...
There once was an old man of Esser,
Whose knowledge grew lesser and lesser,
It at last grew so small
He knew nothing at all
And now he's a college professor.