Commits on Mar 29, 2017
  1. Fixed a one-parameter visualization bug and typos in the documentation

    When optimizing a single parameter, there were issues reimporting the
    saved files for the visualizations. This was caused by the problematic
    numpy corner case of zero-d arrays, or one-d arrays with a single
    element. These are now sanitized. Also fixed some critical typos in
    the documentation.
    committed Mar 29, 2017
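The sanitization described above can be sketched as follows. This is an illustrative helper, not M-LOOP's actual code; the function name `sanitize_loaded_params` and the target (num_runs, num_params) shape are assumptions about what "sanitized" means here.

```python
import numpy as np

def sanitize_loaded_params(raw):
    """Normalize parameter data loaded from a saved file.

    With a single parameter, arrays round-tripped through disk can come
    back as 0-d scalars or 1-d single-element arrays. Force a consistent
    2-d (num_runs, num_params) shape so downstream visualization code can
    index runs and parameters uniformly.
    """
    arr = np.atleast_1d(np.asarray(raw, dtype=float))
    if arr.ndim == 1:
        # A flat array of N values is N runs of a single parameter.
        arr = arr.reshape(-1, 1)
    return arr
```

With this, a stored scalar, a flat list, and a proper 2-d array all come out with two dimensions.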
Commits on Mar 24, 2017
  1. Fixed bug in visualizations

    Fixed a bug where an attribute wasn’t present in the learner class.
    This was a problem when attempting to plot the visualizations from a
    file.
    committed Mar 24, 2017
Commits on Mar 2, 2017
  1. Previous data files can now be imported

    Added support for importing previous data files into a Gaussian
    process learner.
    committed Mar 2, 2017
Commits on Dec 9, 2016
  1. Revert "Get NN working a bit better on the tests"

    charmasaur committed on GitHub Dec 9, 2016
Commits on Dec 4, 2016
  1. Don't do one last train in order to predict minima

    at the end. This was causing an exception to be thrown when trying to
    get costs from the queue.
    charmasaur committed Dec 4, 2016
  2. Set new_params_event in MLC after getting the cost

    When generation_num=1, if the new_params_event is set first then the
    learner will try to get the cost when the queue is empty, causing an
    exception.
    charmasaur committed Dec 4, 2016
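The ordering fix above can be sketched with Python's multiprocessing primitives. All names here (`run_learner_step`, the queue and event arguments) are illustrative stand-ins, not M-LOOP's actual API; the point is only the order of operations.

```python
import multiprocessing as mp

def run_learner_step(costs_in_queue, new_params_event, params_out_queue, next_params):
    """One iteration of a hypothetical learner loop.

    When generation_num=1 the other side may react to new_params_event
    immediately, so the cost must be drained from the queue *before* the
    event is set; signaling first can leave the consumer reading an empty
    queue and raising an exception.
    """
    cost = costs_in_queue.get()        # 1. consume the cost first
    params_out_queue.put(next_params)  # 2. publish the next parameters
    new_params_event.set()             # 3. only now signal that they are ready
    return cost
```

Setting the event last guarantees the queue is never observed empty by a consumer woken by the event.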
Commits on Dec 2, 2016
  1. Merge branch 'NeuralNetA' of https://github.com/michaelhush/M-LOOP into NeuralNetA
    
    Conflicts:
    	mloop/controllers.py
    	mloop/learners.py
    charmasaur committed Dec 2, 2016
  2. Fix importing/creation of NN impl

    We need to specify nnlearner as a package. More subtly, because of TF
    we can only run NNI in the same process in which it's created. This
    means we need to wait until the run() method of the learner is called
    before constructing the impl.
    charmasaur committed Dec 2, 2016
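The deferred-construction pattern described above can be sketched like this. The class and attribute names are hypothetical, and a plain factory callable stands in for the TensorFlow-backed impl; the point is that construction happens in run(), in the process that will use it, not in __init__.

```python
class NeuralNetLearner:
    """Sketch of deferring construction of a process-bound implementation.

    TensorFlow state generally cannot be shared across a process boundary,
    so instead of building the impl in __init__ (which runs in the parent
    process), we store a factory and build the impl lazily inside run(),
    which executes in the learner's own process.
    """

    def __init__(self, make_impl):
        self._make_impl = make_impl  # store a factory, not the impl itself
        self._impl = None            # constructed lazily in run()

    def run(self):
        # First (and only) place the impl is created: the process calling
        # run() is the process that owns and uses it.
        if self._impl is None:
            self._impl = self._make_impl()
        return self._impl
```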
  3. Tidying/logging for NN impl

    charmasaur committed Dec 2, 2016
  4. Basic NN learner implementation

    I've pulled the actual network logic out into a new class, to
    keep the TF stuff separate from everything else and to keep a
    clear separation between what's modelling the landscape and
    what's doing prediction.
    charmasaur committed Dec 2, 2016
Commits on Dec 1, 2016
  1. Fix some NN typos

    charmasaur committed Dec 1, 2016
  2. NeuralNet ready for actual net

    There appear to be some issues with multiprocessing and the Gaussian
    process, but only on macOS, and possibly just on my machine. So I’ve
    removed all the testing statements I had in the previous commit.
    
    Branch should be ready now to integrate in a genuine NN.
    committed Dec 1, 2016
Commits on Nov 30, 2016
  1. Added visualizations; introduced bug

    Visualizations now work for NN and GP learners.
    
    A mysterious bug has appeared in the GP: scikit-learn stops providing
    uncertainty predictions after being fit a certain number of times.
    
    Committing so I can change branch and investigate.
    committed Nov 30, 2016
Commits on Nov 25, 2016
  1. Tweaks to NN learner shell

    charmasaur committed Nov 25, 2016
  2. Fix some whitespace errors

    Git complains to me about them when I touch nearby lines, so I figured
    it was easier just to fix them.
    charmasaur committed Nov 25, 2016
Commits on Nov 24, 2016
  1. Added a shell for the neural net

    Added a controller and learner for the neural net.
    
    Also added a new class MachineLearnerController which GaussianProcess
    and NeuralNet both inherit from.
    
    I broke the visualizations for GPs in this update, but all the tests
    pass.
    committed Nov 24, 2016
  2. Updates to tests and utilities

    Added some updates to docstrings and test unit parameters.
    committed Nov 24, 2016
Commits on Nov 4, 2016
  1. v2.1.1 Candidate

    Updated the documentation.
    Candidate for new version to be released on PyPI
    committed Nov 4, 2016
  2. Fixed halting conditions

    Previously, the training runs had to complete before M-LOOP would
    halt. This led to unintuitive behavior when the halting conditions
    were met early in the optimization process.
    
    M-LOOP now halts immediately when any of the halting conditions are
    met.
    committed Nov 4, 2016
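A minimal sketch of the new behavior. All names here (`optimization_loop`, `get_cost`, `propose_params`, `halting_conditions`) are hypothetical stand-ins, not M-LOOP's real interfaces; the point is that the conditions are checked after every cost evaluation rather than only after the training runs finish.

```python
def optimization_loop(get_cost, propose_params, halting_conditions, initial_params):
    """Illustrative loop that halts as soon as any condition is met."""
    params = initial_params
    history = []  # list of (params, cost) pairs seen so far
    while True:
        cost = get_cost(params)
        history.append((params, cost))
        # Check every halting condition after each run, so the loop can
        # stop immediately, even mid-way through the training runs.
        if any(cond(history) for cond in halting_conditions):
            break
        params = propose_params(history)
    return history
```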
Commits on Nov 3, 2016
  1. Added additional tests for halting conditions.

    Fixed bug with GP fitting data with bad runs.
    mhush committed Nov 3, 2016
Commits on Oct 13, 2016
  1. Candidate for tag v2.1.0

    Candidate for version 2.1.0 release.
    committed Oct 13, 2016
  2. Documentation update

    The documentation update is complete. It now describes how to use the
    shell interface, the differential evolution optimizer, and M-LOOP as a
    Python API.
    committed Oct 13, 2016
  3. Documentation updated

    The documentation has been updated to explain all the added features
    and how to use M-LOOP as a Python API. Still needs proofreading.
    committed Oct 13, 2016