Commits on Oct 21, 2016
@charmasaur Tweaks to tutorials documentation ea14b2f
Commits on Oct 22, 2016
@charmasaur Fix setup syntax error 3c97b8f
Commits on Nov 02, 2016
@michaelhush Merge pull request #12 from charmasaur/patch-1
Documentation and setup tweaks
dfb5cd3
Commits on Nov 03, 2016
mhush Added additional tests for halting conditions.
Fixed a bug in GP fitting when the data contained bad runs.
1897106
Commits on Nov 04, 2016
@michaelhush Fixed halting conditions
Previously the training runs had to be completed before M-LOOP would
halt. This led to unintuitive behavior when the halting conditions
were met early in the optimization process.

M-LOOP now halts immediately when any of the halting conditions are
met.
cfa5748
@michaelhush Merge pull request #14 from michaelhush/fixbad
Fixed halting conditions and bad flags
8e7cff7
@michaelhush v2.1.1 Candidate
Updated the documentation.
Candidate for new version to be released on PyPI
58577fd
Commits on Nov 24, 2016
@michaelhush Update to test and utilities
Added some updates to docstrings and test unit parameters.
baa5074
@michaelhush Added a shell for the neural net
Added a controller and learner for the neural net.

Also added a new class MachineLearnerController which GaussianProcess
and NeuralNet both inherit from.

I broke the visualizations for GPs in this update. But all the tests
work.
ecffda8
Commits on Nov 25, 2016
@charmasaur Fix some whitespace errors
Git complains to me about them when I touch nearby lines, so I figured
it was easier just to fix them.
5f48778
@charmasaur Fix some minor controller documentation errors 326f98b
@charmasaur Tweaks to NN learner shell 635a5f7
@charmasaur Remove unnecessary uncertainty stuff from NNL 6a6f663
Commits on Nov 30, 2016
@michaelhush Added visualizations; introduced a bug
Visualizations now work for NN and GP learners.

A mysterious bug has appeared in the GP: scikit-learn stops providing
uncertainty predictions after being fit a certain number of times.

Committing so I can change branch and investigate.
97d5b23
Commits on Dec 01, 2016
@michaelhush NeuralNet ready for actual net
There appear to be some issues with multiprocessing and the Gaussian
process, but only on macOS, and possibly just my machine. So I've
removed all the testing statements I had in the previous commit.

Branch should be ready now to integrate in a genuine NN.
e8a8715
@charmasaur Fix some NN typos 2efd317
Commits on Dec 02, 2016
@charmasaur Basic NN learner implementation
I've pulled the actual network logic out into a new class, to
keep the TF stuff separate from everything else and to keep a
clear separation between what's modelling the landscape and
what's doing prediction.
d5c5749
@charmasaur Fix number_of_controllers definition d7b1fca
@charmasaur More NNController tidying/tweaking 2126150
@charmasaur Remove scaler from NNController d78a661
@charmasaur Tidying/logging for NN impl 34b504b
@charmasaur Fix importing/creation of NN impl
We need to specify nnlearner as a package. More subtly, because of TF
we can only run NNI in the same process in which it's created. This
means we need to wait until the run() method of the learner is called
before constructing the impl.
9224be5
@charmasaur Merge branch 'NeuralNetA' of https://github.com/michaelhush/M-LOOP into NeuralNetA

Conflicts:
	mloop/controllers.py
	mloop/learners.py
f76c9b2
Commits on Dec 03, 2016
@charmasaur Pull NNI construction into create_neural_net be3c8a5
@charmasaur Dumb implementation of predict_costs array version 3a46a17
Commits on Dec 04, 2016
@charmasaur Set new_params_event in MLC after getting the cost
When generation_num=1, if the new_params_event is set first then the
learner will try to get the cost when the queue is empty, causing an
exception.
89f1e1a
@charmasaur Add (trivial) scaler back to NNL 3e4b3df
@charmasaur Don't do one last train in order to predict minima at the end
This was causing an exception to be thrown when trying to get costs
from the queue.
f22c979
Commits on Dec 05, 2016
@michaelhush Merge pull request #15 from charmasaur/NeuralNetA
Adding NN from charmasaur
82fa70a
Commits on Dec 09, 2016
@charmasaur Tweak some NNI params to perform better on the test e30906a
@charmasaur Still print predicted_best_cost even when predicted_best_uncertainty isn't set
e6d371a
@charmasaur Use TF gradient when minimizing NN cost function estimate e6e83e8
@charmasaur Plot NN surface when there are 2 params 1900587
@charmasaur Merge pull request #16 from charmasaur/NeuralNetA
Get NN working a bit better on the tests
df56ca1
@charmasaur Revert "Get NN working a bit better on the tests"
9835e3f
@charmasaur Merge pull request #17 from michaelhush/revert-16-NeuralNetA
Revert "Get NN working a bit better on the tests"
99d5c95
Commits on Mar 02, 2017
@michaelhush Previous data files can now be imported
Added support for importing previous data files into a Gaussian
process learner.
c2f6519
Commits on Mar 24, 2017
@michaelhush Fixed bug in visualizations
Fixed a bug where an attribute wasn't present in the learner class. This
was a problem when attempting to plot the visualizations from a file.
47c16bf
Commits on Mar 29, 2017
@michaelhush Fixed one param visualization bug and typos in documentation
When optimizing one parameter, there were some issues reimporting the
saved files for the visualizations to work. This was due to the
problematic corner case of zero-D or one-D one-element arrays in
numpy. This has now been sanitized. Also fixed some critical typos in
the documentation.
3bc0374