Adding NN from charmasaur #15

Merged

merged 17 commits on Dec 5, 2016

Owner

michaelhush commented Dec 5, 2016

No description provided.

charmasaur added some commits Nov 25, 2016

@charmasaur charmasaur Fix some whitespace errors
Git complains to me about them when I touch nearby lines, so I figured
it was easier just to fix them.
5f48778
@charmasaur charmasaur Fix some minor controller documentation errors 326f98b
@charmasaur charmasaur Tweaks to NN learner shell 635a5f7
@charmasaur charmasaur Remove unnecessary uncertainty stuff from NNL 6a6f663
@charmasaur charmasaur Fix some NN typos 2efd317
@charmasaur charmasaur Basic NN learner implementation
I've pulled the actual network logic out into a new class, to
keep the TF stuff separate from everything else and to keep a
clear separation between what's modelling the landscape and
what's doing prediction.
d5c5749
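The separation described in this commit can be sketched as follows. This is an illustrative outline only, not the actual M-LOOP API: the class and method names are assumptions, and a trivial nearest-neighbour stand-in replaces the TensorFlow network so the sketch stays self-contained. The point is the structure: the model class owns all landscape-modelling logic, while the learner only orchestrates prediction.

```python
class NeuralNet:
    """Models the cost landscape. In the real code this would wrap the
    TensorFlow machinery; a nearest-neighbour lookup stands in here."""

    def __init__(self):
        self._params = []
        self._costs = []

    def fit(self, params, cost):
        # Record one observed (parameters, cost) pair.
        self._params.append(tuple(params))
        self._costs.append(cost)

    def predict_cost(self, params):
        # Return the cost of the closest previously seen point.
        def dist(p):
            return sum((a - b) ** 2 for a, b in zip(p, params))
        best_index = min(range(len(self._params)),
                         key=lambda i: dist(self._params[i]))
        return self._costs[best_index]


class NeuralNetLearner:
    """Drives the optimization loop; knows nothing about how the
    landscape model is implemented internally."""

    def __init__(self, model):
        self.model = model

    def report(self, params, cost):
        self.model.fit(params, cost)

    def predict(self, params):
        return self.model.predict_cost(params)
```

Keeping the TF-specific code behind a small interface like this also makes the learner testable without constructing a real network.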
@charmasaur charmasaur Fix number_of_controllers definition d7b1fca
@charmasaur charmasaur More NNController tidying/tweaking 2126150
@charmasaur charmasaur Remove scaler from NNController d78a661
@charmasaur charmasaur Tidying/logging for NN impl 34b504b
@charmasaur charmasaur Fix importing/creation of NN impl
We need to specify nnlearner as a package. More subtly, because of TF
we can only run NNI in the same process in which it's created. This
means we need to wait until the run() method of the learner is called
before constructing the impl.
9224be5
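The deferred construction described in this commit might look like the sketch below (names are illustrative assumptions, not the real M-LOOP code). TensorFlow state generally cannot be created in one process and then used from another, so the impl must be built inside run(), which executes in the learner's own process, rather than in __init__, which runs in the parent.

```python
class NeuralNetImpl:
    """Stands in for the TF-backed implementation."""
    def __init__(self):
        self.ready = True


class NeuralNetLearner:
    def __init__(self):
        # Do NOT build the impl here: __init__ runs in the parent
        # process, but run() executes in the learner's own process.
        self._impl = None

    def run(self):
        # Construct the impl lazily, in the same process that uses it.
        if self._impl is None:
            self._impl = NeuralNetImpl()
        return self._impl.ready
```

The same lazy-initialization pattern applies whenever an object holds resources (sessions, sockets, device handles) that do not survive being pickled across a process boundary.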
@charmasaur charmasaur Merge branch 'NeuralNetA' of https://github.com/michaelhush/M-LOOP in…
…to NeuralNetA

Conflicts:
	mloop/controllers.py
	mloop/learners.py
f76c9b2
@charmasaur charmasaur Pull NNI construction into create_neural_net be3c8a5
@charmasaur charmasaur Dumb implementation of predict_costs array version 3a46a17
@charmasaur charmasaur Set new_params_event in MLC after getting the cost
When generation_num=1, if the new_params_event is set first then the
learner will try to get the cost when the queue is empty, causing an
exception.
89f1e1a
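The ordering fix in this commit can be illustrated with a queue/event pair (a hedged sketch, not the actual MLC code; the names mirror those in the commit message but the functions are hypothetical): the cost must be enqueued before the event is set, otherwise a learner woken by the event can find the queue still empty.

```python
import queue
import threading

costs_in_queue = queue.Queue()
new_params_event = threading.Event()

def controller_report(cost):
    # Correct order: enqueue the cost first, THEN signal. Setting the
    # event first opens a window where a waiting learner wakes up and
    # reads from an empty queue, raising queue.Empty.
    costs_in_queue.put(cost)
    new_params_event.set()

def learner_get_cost(timeout=1.0):
    new_params_event.wait(timeout)
    # Safe: by the time the event is set, the cost is already queued.
    return costs_in_queue.get_nowait()
```

This is the standard rule for signalling with an event plus a shared container: publish the data before raising the flag.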
@charmasaur charmasaur Add (trivial) scaler back to NNL 3e4b3df
@charmasaur charmasaur Don't do one last train in order to predict minima at the end
This was causing an exception to be thrown when trying to get costs
from the queue.
f22c979

@michaelhush michaelhush merged commit 82fa70a into michaelhush:NeuralNetA Dec 5, 2016

1 check was pending: continuous-integration/travis-ci/pr (the Travis CI build was in progress).