Merge pull request #30 from charmasaur/nntesting
Improved visualization of cost landscape for an ensemble of nets
Add plot of max/min/mean cross sections
charmasaur committed Jun 1, 2017
Switch cross sections to be about best found point
(rather than best predicted point)
charmasaur committed Jun 1, 2017
Merge pull request #29 from charmasaur/NeuralNetA
Train an ensemble of nets simultaneously
Make sure every net gets trained with each new parameter set
Previously, if the number of nets was greater than the generation number, the extra nets would never be trained.
charmasaur committed Jun 1, 2017
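The bug class this commit describes can be sketched as follows. This is an illustrative toy model, not M-LOOP's actual API: distributing training by generation index skips any nets beyond the generation count, while the fix is to train every net on each new parameter set.

```python
# Toy model of the ensemble-training bug described above.
# Class and method names are hypothetical, for illustration only.

class Ensemble:
    def __init__(self, num_nets):
        self.trained = [0] * num_nets  # training count per net

    def train_buggy(self, generation_num):
        # Old behaviour: one net per generation index, so nets with
        # index >= generation_num never get trained.
        for g in range(generation_num):
            self.trained[g % len(self.trained)] += 1

    def train_fixed(self, generation_num):
        # New behaviour: every net sees each new parameter set.
        for i in range(len(self.trained)):
            self.trained[i] += 1

ens = Ensemble(num_nets=5)
ens.train_buggy(generation_num=3)
assert ens.trained == [1, 1, 1, 0, 0]  # nets 3 and 4 were skipped

ens = Ensemble(num_nets=5)
ens.train_fixed(generation_num=3)
assert ens.trained == [1, 1, 1, 1, 1]  # every net trained
```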
Add script to show visualisations from an archive
charmasaur committed Jun 1, 2017
Support uploading NN cross sections to plotly
charmasaur committed Jun 1, 2017
Support multiple nets in visualisations
charmasaur committed May 31, 2017
Allow multiple nets to be saved
Previously this didn't really work: nets constructed in quick succession would overwrite each other's save files. Now the filenames include a few random bytes, so this shouldn't be a problem any more.
charmasaur committed May 31, 2017
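The filename scheme described above can be sketched like this. The function name, prefix, and extension are hypothetical; the point is only that a few random bytes in the name make collisions between nets created in quick succession vanishingly unlikely, unlike timestamp-based names.

```python
import binascii
import os

def unique_save_path(prefix='neural_net', ext='.ckpt'):
    # Hypothetical helper illustrating the commit's approach: append a
    # few random bytes (hex-encoded) so rapid successive constructions
    # get distinct filenames. 4 bytes gives ~4.3e9 possible suffixes.
    suffix = binascii.hexlify(os.urandom(4)).decode('ascii')
    return '{}_{}{}'.format(prefix, suffix, ext)

# Even 100 nets created back-to-back get distinct paths.
paths = {unique_save_path() for _ in range(100)}
assert len(paths) == 100
```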
Support loading visualisations from archive
charmasaur committed May 31, 2017
Merge branch 'nntesting' of https://github.com/charmasaur/M-LOOP into nntesting
charmasaur committed May 30, 2017
Move start/stop opt out of debugging methods
charmasaur committed May 30, 2017
Implement get_losses for multiple nets
charmasaur committed May 30, 2017
Bump generation num and nets count up to 3
charmasaur committed May 30, 2017
Support multiple nets at the learner level
This means we can do things like cycle through nets per generation, interleave parameter generation with training of those nets, and so on. The sampling can probably be removed from the net itself now.
charmasaur committed May 30, 2017
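The per-generation cycling mentioned above can be sketched minimally. The names here are placeholders, not M-LOOP's real learner API; the sketch only shows how a learner holding the whole ensemble can alternate which net is used each generation.

```python
import itertools

# Hypothetical sketch: the learner owns the ensemble and rotates
# through it, one net per generation, while all nets keep training.
nets = ['net_a', 'net_b', 'net_c']
picker = itertools.cycle(nets)

# Which net proposes parameters in each of the first five generations.
schedule = [next(picker) for _ in range(5)]
assert schedule == ['net_a', 'net_b', 'net_c', 'net_a', 'net_b']
```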
Make NNLearner.fit_neural_net more private
charmasaur committed May 30, 2017
Log learner type with run number
charmasaur committed May 29, 2017
Fix NN exception on python 2
On Python 2, math.ceil returns a float, and this was making range unhappy.
charmasaur committed May 24, 2017
Use mlu.empty_exception instead of queue.Empty
charmasaur committed May 24, 2017
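A compatibility alias like the one this commit references is typically built as follows. This is a sketch of the general pattern, assuming `mlu.empty_exception` is such an alias; the real definition in M-LOOP's utilities module may differ.

```python
# The queue module was renamed between Python 2 (Queue) and Python 3
# (queue), so referencing queue.Empty directly breaks on Python 2.
# A single alias lets the rest of the code stay version-agnostic.
try:
    import queue            # Python 3
except ImportError:
    import Queue as queue   # Python 2

empty_exception = queue.Empty  # one name usable on both versions

def drain(q):
    # Pop everything currently in the queue without blocking.
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except empty_exception:
            return items

q = queue.Queue()
q.put(1)
q.put(2)
assert drain(q) == [1, 2]
```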
Add tensorflow to requirements and setup.py
charmasaur committed May 24, 2017
Merge pull request #28 from charmasaur/l2
Add "training" logging to all training runs
Merge pull request #27 from charmasaur/logging
Log learner type with run number
Merge pull request #26 from charmasaur/nnrangefix
Fix NN exception on python 2
On Python 2, math.ceil returns a float, and this was making range unhappy.
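The fix described here is a one-liner. On Python 2, `math.ceil(2.5)` returns the float `3.0`, and `range(3.0)` raises `TypeError`; on Python 3 it already returns an int. Wrapping the result in `int()` works on both versions:

```python
import math

# math.ceil returns float on Python 2 and int on Python 3;
# int(...) makes the result safe to pass to range on either.
n = int(math.ceil(5 / 2.0))
assert n == 3
assert list(range(n)) == [0, 1, 2]
```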
Merge pull request #24 from charmasaur/NeuralNetA
Import neural net changes from charmasaur