Comparing changes
3 contributors
Commits on Sep 15, 2016

a63e40c  michaelhush  Fixed indentation error
    There was a small indentation error in the M-LOOP executable. This has been fixed.
Commits on Sep 22, 2016

801b304  michaelhush  Added command line interface
    M-LOOP now supports running experiments through commands on the computer's shell. An example has been added; the documentation does not yet include this change.

2256b1a  michaelhush  Revert "Added command line interface"
    This reverts commit 801b304.

83cb1e8  michaelhush  Revert "Revert "Added command line interface""
    This reverts commit 2256b1a.
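The shell interface reintroduced above runs the experiment as a command and reads the cost back from the process output. A minimal sketch of an experiment script that could sit on the other end of such an interface (the argument convention and the `cost = ...` output format are assumptions for illustration, not M-LOOP's documented protocol):

```python
import sys

def cost_from_params(params):
    """Toy 'experiment': cost is the squared distance from the optimum at 1.5."""
    return sum((p - 1.5) ** 2 for p in params)

def main(argv):
    # Parameters arrive as command-line arguments, one float per parameter.
    params = [float(arg) for arg in argv]
    # Assumed output format: the optimizer parses stdout for the cost line.
    print("cost = %f" % cost_from_params(params))

if __name__ == "__main__":
    main(sys.argv[1:])
```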
Commits on Oct 04, 2016

6f10138  michaelhush  Command line now supported on Windows
    The command line interface now works on Windows as well as Linux and macOS.

746e864  michaelhush  scikit-learn now required
    scikit-learn 0.18 has now officially been released as the new stable version. It is also available in the Anaconda distribution. We have removed the scikit-learn code that was previously bundled with M-LOOP and now list it as a dependency.

7555f10  michaelhush  Attempt to get Travis CI working
    Added a requirements file and updated Travis to make more conservative tests.

9c357d3  michaelhush  Added a later version of pip requirement
    Hopefully this fixes Travis CI.

3a7a1fc  michaelhush  Travis CI issue identified
    The problem with Travis CI is that it uses a very old version of pip when testing on Python 2.7. Attempted to add a command to upgrade pip before running the installation.

925c88d  michaelhush  Final attempt to fix Travis CI
    Travis CI is basically broken for Python 2.7. Still trying to fix it.

ec9fb4c  michaelhush  Added pip requires to setuptools
    Setuptools updated.

667efd4  michaelhush  An upgrade to get Travis CI working
    The Travis 2.7 test now works, but only after updating to 2.7.12 (why Travis CI defaults to 2.7.9 is beyond me). Trying to update 3.5 creates an error; attempted to add a conditional test to get around this.

fb3e54c  michaelhush  Travis updates
    Added $ to variable name.

680fd52  michaelhush  Travis CI correction
    Copied syntax for the conditional test from a Facebook example. Hopefully it will work now.

2079874  michaelhush  Travis CI: defining Python version
    Attempting to define the correct Python version explicitly.
Commits on Oct 05, 2016

3f71bae  michaelhush  Adding differential evolution
    Started modifying the differential evolution code from SciPy for use in M-LOOP.
Commits on Oct 06, 2016

5e089a2  michaelhush  Further refining the differential evolution
    Continuing the implementation.
Commits on Oct 10, 2016

7788a8b  michaelhush  First version of DE complete
    First version of differential evolution complete. Testing will be investigated next.
Commits on Oct 11, 2016

5084b90  michaelhush  DE learner complete and added tests
    The differential evolution controller has been implemented and is now set as the default trainer for the Gaussian process. Tests have been added to the automated suite and some basic visualizations are available. Still have to debug a possible issue with the GP, examples, logging and extras.
Commits on Oct 12, 2016

4c44e1f  michaelhush  Completed differential evolution
    Differential evolution has now been added to M-LOOP and is set as the default trainer for the Gaussian process. Tests and examples have been added. The installation section of the documentation has also been updated.

be1dca8  michaelhush  Merge pull request #10 from michaelhush/DiffEvo
    Differential evolution added to M-LOOP.

c834172  michaelhush  Small update to Travis CI config
    Deleted an unnecessary line from the Travis CI config.

88ec082  michaelhush  Updating the documentation
    Updating the documentation to prepare for the 2.1.0 release.
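The rand/1/bin scheme behind the SciPy code that these commits adapt can be sketched in a few lines. This is a toy illustration of the algorithm, not M-LOOP's actual learner; all names here are illustrative:

```python
import numpy as np

def differential_evolution(cost, bounds, pop_size=20, mutation=0.7,
                           crossover=0.7, generations=100, seed=0):
    """Minimal rand/1/bin differential evolution on box bounds."""
    rng = np.random.RandomState(seed)
    lower, upper = np.asarray(bounds, dtype=float).T
    dim = len(lower)
    # Random initial population inside the bounds.
    pop = lower + rng.rand(pop_size, dim) * (upper - lower)
    costs = np.array([cost(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutate: combine three distinct members other than i.
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + mutation * (b - c), lower, upper)
            # Binomial crossover with one guaranteed mutant coordinate.
            mask = rng.rand(dim) < crossover
            mask[rng.randint(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the trial only if it is no worse.
            trial_cost = cost(trial)
            if trial_cost <= costs[i]:
                pop[i], costs[i] = trial, trial_cost
    best = int(np.argmin(costs))
    return pop[best], costs[best]
```

A run on the 2-D sphere function, `differential_evolution(lambda p: float(np.sum(p**2)), [(-5, 5)] * 2)`, converges to near the origin, which is the sense in which DE "will eventually find a global solution" at the price of many evaluations.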
Commits on Oct 13, 2016

7b57dde  michaelhush  Documentation updated
    Documentation has been updated to explain all the added features and also how to use M-LOOP as a Python API. Still needs proofreading.

d115775  michaelhush  Documentation update
    Update to documentation complete. It now describes how to use the shell interface, the differential evolution optimizer and how to use M-LOOP as an API.

d1e4ed4  michaelhush  Merge pull request #11 from michaelhush/docupdate
    Docupdate.

d3e96f6  michaelhush  Candidate for tag v2.1.0
    Candidate for version 2.1.0 release.
Commits on Oct 21, 2016

ea14b2f  charmasaur  Tweaks to tutorials documentation
Commits on Oct 22, 2016

3c97b8f  charmasaur  Fix setup syntax error
Commits on Nov 02, 2016

dfb5cd3  michaelhush  Merge pull request #12 from charmasaur/patch-1
    Documentation and setup tweaks.
Commits on Nov 03, 2016

1897106  mhush  Added additional tests for halting conditions
    Fixed a bug with the GP fitting data from bad runs.
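A bug of the kind fixed above typically comes from feeding runs flagged as bad (or with undefined costs) into the fit. A hedged sketch of the sanitization; the function and field names are assumptions for illustration, not M-LOOP's API:

```python
import numpy as np

def runs_for_fitting(params, costs, bads):
    """Drop runs flagged bad, or whose cost is non-finite, before fitting.

    M-LOOP records parameters, costs and 'bad' flags per run; a Gaussian
    process fit must only ever see the well-defined runs.
    """
    params = np.asarray(params, dtype=float)
    costs = np.asarray(costs, dtype=float)
    keep = ~np.asarray(bads, dtype=bool) & np.isfinite(costs)
    return params[keep], costs[keep]
```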
Commits on Nov 04, 2016

cfa5748  michaelhush  Fixed halting conditions
    Previously the training runs had to be completed before M-LOOP would halt. This led to unintuitive behavior when the halting conditions were met early in the optimization process. M-LOOP now halts immediately when any of the halting conditions are met.

8e7cff7  michaelhush  Merge pull request #14 from michaelhush/fixbad
    Fixed halting conditions and bad flags.

58577fd  michaelhush  v2.1.1 candidate
    Updated the documentation. Candidate for the new version to be released on PyPI.
Commits on Nov 24, 2016

baa5074  michaelhush  Update to tests and utilities
    Added some updates to docstrings and test unit parameters.

ecffda8  michaelhush  Added a shell for the neural net
    Added a controller and learner for the neural net. Also added a new class, MachineLearnerController, which GaussianProcess and NeuralNet both inherit from. I broke the visualizations for GPs in this update, but all the tests work.
Commits on Nov 25, 2016

5f48778  charmasaur  Fix some whitespace errors
    Git complains to me about them when I touch nearby lines, so I figured it was easier just to fix them.

326f98b  charmasaur  Fix some minor controller documentation errors

635a5f7  charmasaur  Tweaks to NN learner shell

6a6f663  charmasaur  Remove unnecessary uncertainty stuff from NNL
Commits on Nov 30, 2016

97d5b23  michaelhush  Added visualizations; introduced bug
    Visualizations now work for the NN and GP learners. A mysterious bug has appeared in the GP: scikit-learn stops providing uncertainty predictions after being fit a certain number of times. Committing so I can change branch and investigate.
Commits on Dec 01, 2016

e8a8715  michaelhush  NeuralNet ready for actual net
    There appear to be some issues with multiprocessing and the Gaussian process, but only on macOS, and possibly just on my machine. So I've removed all the testing statements I had in the previous commit. The branch should now be ready to integrate a genuine NN.

2efd317  charmasaur  Fix some NN typos
Commits on Dec 02, 2016

d5c5749  charmasaur  Basic NN learner implementation
    I've pulled the actual network logic out into a new class, to keep the TF stuff separate from everything else and to keep a clear separation between what's modelling the landscape and what's doing prediction.

d7b1fca  charmasaur  Fix number_of_controllers definition

2126150  charmasaur  More NNController tidying/tweaking

d78a661  charmasaur  Remove scaler from NNController

34b504b  charmasaur  Tidying/logging for NN impl

9224be5  charmasaur  Fix importing/creation of NN impl
    We need to specify nnlearner as a package. More subtly, because of TF we can only run NNI in the same process in which it's created. This means we need to wait until the run() method of the learner is called before constructing the impl.

f76c9b2  charmasaur  Merge branch 'NeuralNetA' of https://github.com/michaelhush/M-LOOP into NeuralNetA
    Conflicts: mloop/controllers.py mloop/learners.py
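The importing/creation fix above deals with a process-affinity constraint: the TF-backed implementation can only be used in the process that created it, so construction is deferred until the learner's run() method. A sketch of that lazy-construction pattern; the class shape is illustrative, not M-LOOP's actual code:

```python
import os

class NeuralNetLearner:
    """Defer construction of a process-bound resource until first use.

    The TF-backed 'impl' must be built in the process that will use it,
    so it is created inside run() rather than in __init__ (which may run
    in the parent process before a fork/spawn).
    """

    def __init__(self, make_impl):
        self._make_impl = make_impl
        self._impl = None
        self._impl_pid = None

    def run(self):
        # Construct lazily, in the process that actually calls run().
        if self._impl is None:
            self._impl = self._make_impl()
            self._impl_pid = os.getpid()
        # Guard against accidental use from another process.
        assert self._impl_pid == os.getpid(), "impl used outside its process"
        return self._impl
```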
Commits on Dec 03, 2016

be3c8a5  charmasaur  Pull NNI construction into create_neural_net

3a46a17  charmasaur  Dumb implementation of predict_costs array version
Commits on Dec 04, 2016

89f1e1a  charmasaur  Set new_params_event in MLC after getting the cost
    When generation_num=1, if the new_params_event is set first then the learner will try to get the cost when the queue is empty, causing an exception.

3e4b3df  charmasaur  Add (trivial) scaler back to NNL

f22c979  charmasaur  Don't do one last train in order to predict minima at the end
    This was causing an exception to be thrown when trying to get costs from the queue.
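The ordering bug fixed in 89f1e1a is a classic event/queue race. Putting the cost on the queue before setting the event guarantees that a learner woken by the event never finds the queue empty. A self-contained sketch with illustrative names (only the ordering idea is taken from the commit message):

```python
import queue
import threading

costs_in_queue = queue.Queue()
new_params_event = threading.Event()

def deliver_cost(cost):
    costs_in_queue.put(cost)   # 1. make the cost available first
    new_params_event.set()     # 2. only then wake the learner

def learner_wait_for_cost(timeout=1.0):
    new_params_event.wait(timeout)
    new_params_event.clear()
    # Safe: the cost was queued before the event was set, so get_nowait()
    # cannot raise queue.Empty for a genuine wake-up.
    return costs_in_queue.get_nowait()
```

Reversing the two lines in `deliver_cost` reintroduces the race: a learner scheduled between `set()` and `put()` would raise `queue.Empty`.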
Commits on Dec 05, 2016

82fa70a  michaelhush  Merge pull request #15 from charmasaur/NeuralNetA
    Adding NN from charmasaur.
Commits on Dec 09, 2016

e30906a  charmasaur  Tweak some NNI params to perform better on the test

e6d371a  charmasaur  Still print predicted_best_cost even when predicted_best_uncertainty isn't set

e6e83e8  charmasaur  Use TF gradient when minimizing NN cost function estimate

1900587  charmasaur  Plot NN surface when there are 2 params

df56ca1  charmasaur  Merge pull request #16 from charmasaur/NeuralNetA
    Get NN working a bit better on the tests.

9835e3f  charmasaur  Revert "Get NN working a bit better on the tests"

99d5c95  charmasaur  Merge pull request #17 from michaelhush/revert-16-NeuralNetA
    Revert "Get NN working a bit better on the tests".
Commits on Mar 02, 2017

c2f6519  michaelhush  Previous data files can now be imported
    Added support for importing previous data files into a Gaussian process learner.
Commits on Mar 24, 2017

47c16bf  michaelhush  Fixed bug in visualizations
    Fixed a bug where an attribute wasn't present in the learner class. This was a problem when attempting to plot the visualizations from a file.
Commits on Mar 29, 2017

3bc0374  michaelhush  Fixed one-parameter visualization bug and typos in documentation
    When optimizing one parameter, there were some issues re-importing the saved files for the visualizations to work. This was due to the problematic corner case of zero-D or one-D one-element arrays in NumPy. This has now been sanitized. Also fixed some critical typos in the documentation.
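The corner case fixed above is that single-parameter data saved and re-loaded can come back as a 0-D scalar or a flat 1-D array rather than a (num_runs, num_params) table. A hedged sketch of the kind of coercion that removes it (the function name is illustrative, not M-LOOP's):

```python
import numpy as np

def sanitize_params(raw):
    """Coerce re-imported parameter data to a 2-D (num_runs, num_params) array.

    np.atleast_1d promotes a 0-D scalar to shape (1,); a 1-D array is then
    treated as one parameter per run and reshaped to length-1 rows.
    """
    arr = np.atleast_1d(np.asarray(raw, dtype=float))
    if arr.ndim == 1:
        # One parameter per run: make each run a length-1 row.
        arr = arr.reshape(-1, 1)
    return arr
```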
Showing 1,713 additions and 214,103 deletions.
- +6 −5 .travis.yml
- +1 −1 docs/api/controllers.rst
- +1 −0 docs/api/index.rst
- +1 −1 docs/api/interfaces.rst
- +1 −1 docs/api/launchers.rst
- +1 −1 docs/api/learners.rst
- +1 −1 docs/api/mloop.rst
- +1 −1 docs/api/t_esting.rst
- +1 −1 docs/api/utilities.rst
- +1 −1 docs/api/visualizations.rst
- +2 −2 docs/conf.py
- +6 −5 docs/contributing.rst
- +4 −3 docs/data.rst
- +37 −11 docs/examples.rst
- +8 −9 docs/index.rst
- +79 −18 docs/install.rst
- +57 −5 docs/interfaces.rst
- +0 −205 docs/tutorial.rst
- +478 −0 docs/tutorials.rst
- +2 −1 docs/visualizations.rst
- +19 −0 examples/differential_evolution_complete_config.txt
- +15 −0 examples/differential_evolution_simple_config.txt
- +74 −0 examples/python__controlled_experiment.py
- +6 −0 examples/shell_interface_config.txt
- +10 −3 examples/tutorial_config.txt
- +2 −2 mloop/__init__.py
- +18 −20 bin/M-LOOP → mloop/cmd.py
- +91 −45 mloop/controllers.py
- +108 −10 mloop/interfaces.py
- +0 −3 mloop/launchers.py
- +339 −51 mloop/learners.py
- +0 −71 mloop/localsklearn/__init__.py
- +0 −510 mloop/localsklearn/base.py
- +0 −117 mloop/localsklearn/exceptions.py
- +0 −7 mloop/localsklearn/externals/README
- +0 −5 mloop/localsklearn/externals/__init__.py
- +0 −28 mloop/localsklearn/externals/copy_joblib.sh
- +0 −818 mloop/localsklearn/externals/funcsigs.py
- +0 −137 mloop/localsklearn/externals/joblib/__init__.py
- +0 −20 mloop/localsklearn/externals/joblib/_compat.py
- +0 −105 mloop/localsklearn/externals/joblib/_memory_helpers.py
- +0 −39 mloop/localsklearn/externals/joblib/_multiprocessing_helpers.py
- +0 −356 mloop/localsklearn/externals/joblib/_parallel_backends.py
- +0 −106 mloop/localsklearn/externals/joblib/disk.py
- +0 −415 mloop/localsklearn/externals/joblib/format_stack.py
- +0 −355 mloop/localsklearn/externals/joblib/func_inspect.py
- +0 −262 mloop/localsklearn/externals/joblib/hashing.py
- +0 −157 mloop/localsklearn/externals/joblib/logger.py
- +0 −918 mloop/localsklearn/externals/joblib/memory.py
- +0 −112 mloop/localsklearn/externals/joblib/my_exceptions.py
- +0 −577 mloop/localsklearn/externals/joblib/numpy_pickle.py
- +0 −239 mloop/localsklearn/externals/joblib/numpy_pickle_compat.py
- +0 −623 mloop/localsklearn/externals/joblib/numpy_pickle_utils.py
- +0 −779 mloop/localsklearn/externals/joblib/parallel.py
- +0 −615 mloop/localsklearn/externals/joblib/pool.py
- +0 −85 mloop/localsklearn/externals/joblib/testing.py
- +0 −266 mloop/localsklearn/externals/odict.py
- +0 −9 mloop/localsklearn/externals/setup.py
- +0 −577 mloop/localsklearn/externals/six.py
- +0 −23 mloop/localsklearn/gaussian_process/__init__.py
- +0 −284 mloop/localsklearn/gaussian_process/correlation_models.py
- +0 −896 mloop/localsklearn/gaussian_process/gaussian_process.py
- +0 −729 mloop/localsklearn/gaussian_process/gpc.py
- +0 −430 mloop/localsklearn/gaussian_process/gpr.py
- +0 −1,789 mloop/localsklearn/gaussian_process/kernels.py
- +0 −89 mloop/localsklearn/gaussian_process/regression_models.py
- +0 −114 mloop/localsklearn/metrics/__init__.py
- +0 −133 mloop/localsklearn/metrics/base.py
- +0 −1,848 mloop/localsklearn/metrics/classification.py
- +0 −30 mloop/localsklearn/metrics/cluster/__init__.py
- +0 −86 mloop/localsklearn/metrics/cluster/bicluster.py
- +0 −8,145 mloop/localsklearn/metrics/cluster/expected_mutual_info_fast.c
- +0 −71 mloop/localsklearn/metrics/cluster/expected_mutual_info_fast.pyx
- +0 −23 mloop/localsklearn/metrics/cluster/setup.py
- +0 −908 mloop/localsklearn/metrics/cluster/supervised.py
- +0 −258 mloop/localsklearn/metrics/cluster/unsupervised.py
- +0 −1,394 mloop/localsklearn/metrics/pairwise.py
- +0 −25,210 mloop/localsklearn/metrics/pairwise_fast.c
- +0 −79 mloop/localsklearn/metrics/pairwise_fast.pyx
- +0 −762 mloop/localsklearn/metrics/ranking.py
- +0 −491 mloop/localsklearn/metrics/regression.py
- +0 −357 mloop/localsklearn/metrics/scorer.py
- +0 −32 mloop/localsklearn/metrics/setup.py
- +0 −57 mloop/localsklearn/preprocessing/__init__.py
- +0 −92 mloop/localsklearn/preprocessing/_function_transformer.py
- +0 −1,923 mloop/localsklearn/preprocessing/data.py
- +0 −436 mloop/localsklearn/preprocessing/imputation.py
- +0 −813 mloop/localsklearn/preprocessing/label.py
- +0 −420 mloop/localsklearn/utils/__init__.py
- +0 −6,257 mloop/localsklearn/utils/_logistic_sigmoid.c
- +0 −27 mloop/localsklearn/utils/_logistic_sigmoid.pyx
- +0 −8,693 mloop/localsklearn/utils/_random.c
- +0 −14 mloop/localsklearn/utils/_random.pxd
- +0 −303 mloop/localsklearn/utils/_random.pyx
- +0 −508 mloop/localsklearn/utils/_scipy_sparse_lsqr_backport.py
- +0 −1,859 mloop/localsklearn/utils/arpack.py
- +0 −6,135 mloop/localsklearn/utils/arrayfuncs.c
- +0 −64 mloop/localsklearn/utils/arrayfuncs.pyx
- +0 −17 mloop/localsklearn/utils/bench.py
- +0 −185 mloop/localsklearn/utils/class_weight.py
- +0 −85 mloop/localsklearn/utils/deprecation.py
- +0 −1,555 mloop/localsklearn/utils/estimator_checks.py
- +0 −853 mloop/localsklearn/utils/extmath.py
- +0 −23,792 mloop/localsklearn/utils/fast_dict.cpp
- +0 −24 mloop/localsklearn/utils/fast_dict.pxd
- +0 −155 mloop/localsklearn/utils/fast_dict.pyx
- +0 −445 mloop/localsklearn/utils/fixes.py
- +0 −183 mloop/localsklearn/utils/graph.py
- +0 −10,409 mloop/localsklearn/utils/graph_shortest_path.c
- +0 −610 mloop/localsklearn/utils/graph_shortest_path.pyx
- +0 −2,046 mloop/localsklearn/utils/lgamma.c
- +0 −1 mloop/localsklearn/utils/lgamma.pxd
- +0 −8 mloop/localsklearn/utils/lgamma.pyx
- +0 −284 mloop/localsklearn/utils/linear_assignment_.py
- +0 −72 mloop/localsklearn/utils/metaestimators.py
- +0 −70 mloop/localsklearn/utils/mocking.py
- +0 −388 mloop/localsklearn/utils/multiclass.py
- +0 −8,576 mloop/localsklearn/utils/murmurhash.c
- +0 −21 mloop/localsklearn/utils/murmurhash.pxd
- +0 −131 mloop/localsklearn/utils/murmurhash.pyx
- +0 −204 mloop/localsklearn/utils/optimize.py
- +0 −288 mloop/localsklearn/utils/random.py
- +0 −9,505 mloop/localsklearn/utils/seq_dataset.c
- +0 −51 mloop/localsklearn/utils/seq_dataset.pxd
- +0 −300 mloop/localsklearn/utils/seq_dataset.pyx
- +0 −84 mloop/localsklearn/utils/setup.py
- +0 −471 mloop/localsklearn/utils/sparsefuncs.py
- +0 −36,213 mloop/localsklearn/utils/sparsefuncs_fast.c
- +0 −423 mloop/localsklearn/utils/sparsefuncs_fast.pyx
- +0 −1 mloop/localsklearn/utils/sparsetools/README
- +0 −5 mloop/localsklearn/utils/sparsetools/__init__.py
- +0 −11,547 mloop/localsklearn/utils/sparsetools/_graph_tools.c
- +0 −460 mloop/localsklearn/utils/sparsetools/_graph_tools.pyx
- +0 −58 mloop/localsklearn/utils/sparsetools/_graph_validation.py
- +0 −13,792 mloop/localsklearn/utils/sparsetools/_traversal.c
- +0 −748 mloop/localsklearn/utils/sparsetools/_traversal.pyx
- +0 −26 mloop/localsklearn/utils/sparsetools/setup.py
- +0 −346 mloop/localsklearn/utils/src/MurmurHash3.cpp
- +0 −45 mloop/localsklearn/utils/src/MurmurHash3.h
- +0 −76 mloop/localsklearn/utils/src/cholesky_delete.h
- +0 −155 mloop/localsklearn/utils/src/gamma.c
- +0 −8 mloop/localsklearn/utils/src/gamma.h
- +0 −59 mloop/localsklearn/utils/stats.py
- +0 −799 mloop/localsklearn/utils/testing.py
- +0 −706 mloop/localsklearn/utils/validation.py
- +0 −7,081 mloop/localsklearn/utils/weight_vector.c
- +0 −29 mloop/localsklearn/utils/weight_vector.pxd
- +0 −199 mloop/localsklearn/utils/weight_vector.pyx
- +45 −0 mloop/utilities.py
- +118 −7 mloop/visualizations.py
- +8 −0 requirements.txt
- +13 −3 setup.py
- +21 −0 tests/shell_script.py
- +58 −8 tests/test_examples.py
- +80 −0 tests/test_units.py
.travis.yml

@@ -1,14 +1,15 @@
 language: python
 python:
-  - "2.7"
-  - "3.4"
+  - "2.7.12"
   - "3.5"
-install:
-  - pip install .
+install:
+  - pip install --upgrade pip
+  - python --version
+  - pip --version
+  - pip install -r requirements.txt
 # command to run tests
 script: python setup.py test
 os:
   - linux
-  - osx
docs/api/controllers.rst

@@ -1,7 +1,7 @@
 .. _api-controllers:

 controllers
------------
+===========

 .. automodule:: mloop.controllers
    :members:
docs/api/index.rst

@@ -1,5 +1,6 @@
 .. _sec-api:

+==========
 M-LOOP API
 ==========
docs/api/interfaces.rst

@@ -1,5 +1,5 @@
 interfaces
-----------
+==========

 .. automodule:: mloop.interfaces
    :members:
docs/api/launchers.rst

@@ -1,5 +1,5 @@
 launchers
---------
+=========

 .. automodule:: mloop.launchers
    :members:
docs/api/learners.rst

@@ -1,7 +1,7 @@
 .. _api-learners:

 learners
---------
+========

 .. automodule:: mloop.learners
    :members:
docs/api/mloop.rst

@@ -1,4 +1,4 @@
 mloop
-----
+=====

 .. automodule:: mloop
docs/api/t_esting.rst

@@ -1,5 +1,5 @@
 testing
-------
+=======

 .. automodule:: mloop.testing
    :members:
docs/api/utilities.rst

@@ -1,5 +1,5 @@
 utilities
---------
+=========

 .. automodule:: mloop.utilities
    :members:
docs/api/visualizations.rst

@@ -1,5 +1,5 @@
 visualizations
--------------
+==============

 .. automodule:: mloop.visualizations
    :members:
docs/conf.py

@@ -70,9 +70,9 @@
 # built documents.
 #
 # The short X.Y version.
-version = '2.0'
+version = '2.1'
 # The full version, including alpha/beta/rc tags.
-release = '2.0.2'
+release = '2.1.0'

 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
docs/contributing.rst

@@ -1,19 +1,20 @@
 .. _sec-contributing:

+============
 Contributing
 ============

 If you use M-LOOP please consider contributing to the project. There are many quick and easy ways to help out.

-- If you use M-LOOP be sure to cite paper where it first used: `'Fast machine-learning online optimization of ultra-cold-atom experiments', Sci Rep 6, 25890 (2016) <http://www.nature.com/srep/>`_.
-- Star and watch the `M-LOOP github <https://github.com/michaelhush/M-LOOP/watchers>`_.
-- Make a suggestion on what features you would like added, or report an issue, on the `github <https://github.com/michaelhush/M-LOOP/watchers>`_ or by `email <mailto:MichaelRHush@gmail.com>`_.
-- Contribute your own code to the `M-LOOP github <https://github.com/michaelhush/M-LOOP/watchers>`_, this could be the interface you designed, more options or a completely new solver.
+- If you use M-LOOP be sure to cite the paper where it first used: `'Fast machine-learning online optimization of ultra-cold-atom experiments', Sci Rep 6, 25890 (2016) <http://www.nature.com/srep/>`_.
+- Star and watch the `M-LOOP GitHub <https://github.com/michaelhush/M-LOOP/watchers>`_.
+- Make a suggestion on what features you would like added, or report an issue, on the `GitHub <https://github.com/michaelhush/M-LOOP/watchers>`_ or by `email <mailto:MichaelRHush@gmail.com>`_.
+- Contribute your own code to the `M-LOOP GitHub <https://github.com/michaelhush/M-LOOP/watchers>`_, this could be the interface you designed, more options or a completely new solver.

 Finally spread the word! Let others know the success you have had with M-LOOP and recommend they try it too.

 Contributors
-------------
+============

 M-LOOP is written and maintained by `Michael R Hush <http://www.michaelhush.com>`_ <MichaelRHush@gmail.com>
docs/data.rst

@@ -1,12 +1,13 @@
 .. _sec-data:

+====
 Data
 ====

 M-LOOP saves all data produced by the experiment in archives which are saved to disk during and after the optimization run. The archives also contain information derived from the data, including the machine learning model for how the experiment works. Here we explain how to interpret the file archives.

 File Formats
-------------
+============

 M-LOOP currently supports three file formats for all file input and output.

@@ -15,7 +16,7 @@ M-LOOP currently supports three file formats for all file input and output.
 - 'pkl' pickle files: a serialization of a python dictionary made with `pickle <https://docs.python.org/3/library/pickle.html>`. Your data can be retrieved from this dictionary using the appropriate keywords.

 File Keywords
--------------
+=============

 The archives contain a set of keywords/variable names with associated data. The quickest way to understand what the values mean for a particular keyword is to :ref:`search` the documentation for a description.

@@ -26,7 +27,7 @@ For the controller archive see :ref:`api-controllers`.

 For the learner archive see :ref:`api-learners`. The generic keywords are described in the class Learner, with learner specific options described in the derived classes, for example GaussianProcessLearner.

 Converting files
-----------------
+================

 If for whatever reason you want to convert files between the formats you can do so using the utilities module of M-LOOP. For example the following python code will convert the file controller_archive_2016-08-18_12-18.pkl from a 'pkl' file to a 'mat' file::
docs/examples.rst

@@ -1,5 +1,6 @@
 .. _sec-examples:

+========
 Examples
 ========

@@ -10,24 +11,33 @@ The options available are also comprehensively documented in the :ref:`sec-api`
 Each of the example files is used when running tests of M-LOOP. So please copy and modify them elsewhere if you use them as a starting point for your configuration file.

 Interfaces
-----------
+==========

-There is currently one interface supported: 'file'. You can specify which interface you want with the option::
+There are currently two interfaces supported: 'file' and 'shell'. You can specify which interface you want with the option::

    interface_type = [name]

 The default will be 'file'. The specific options for each of the interfaces are described below.

 File Interface
-~~~~~~~~~~~~~~
+--------------

-You can change the names of the files used for the file interface and their type. The file interface options are described in *file_interface_config.txt*.
+The file interface exchanges information with the experiment by writing files to disk. You can change the names of the files used for the file interface and their type. The file interface options are described in *file_interface_config.txt*.

 .. include:: ../examples/file_interface_config.txt
    :literal:

+Shell Interface
+---------------
+
+The shell interface is for experiments that can be run through a command executed in a shell. Information is then piped between M-LOOP and the experiment through the shell. You can change the command to run the experiment and the way the parameters are formatted. The shell interface options are described in *shell_interface_config.txt*
+
+.. include:: ../examples/shell_interface_config.txt
+   :literal:
+
+
 Controllers
------------
+===========

 There are currently three controller types supported: 'gaussian_process', 'random' and 'nelder_mead'. The default is 'gaussian_process'. You can set which interface you want to use with the option::

@@ -38,8 +48,8 @@ Each of the controllers and their specific options are described below. There is
 .. include:: ../examples/controller_config.txt
    :literal:

-Gaussian Process
-~~~~~~~~~~~~~~~~
+Gaussian process
+----------------

 The Gaussian-process controller is the default controller and is the currently the most sophisticated machine learner algorithm. It uses a `Link Gaussian process <http://scikit-learn.org/dev/modules/gaussian_process.html>`_ to develop a model for how the parameters relate to the measured cost, effectively creating a model for how the experiment operates. This model is then used when picking new points to test.

@@ -52,9 +62,25 @@ There are two example files for the Gaussian-process controller: *gaussian_proce
 .. include:: ../examples/gaussian_process_complete_config.txt
    :literal:

+
+Differential evolution
+----------------------
+
+The differential evolution (DE) controller uses a `Link DE algorithm <https://en.wikipedia.org/wiki/Differential_evolution>`_ for optimization. DE is a type of evolutionary algorithm, and is historically the most commonly used in automated optimization. DE will eventually find a global solution, however it can take many experiments before it does so.
+
+There are two example files for the differential evolution controller: *differential_evolution_simple_config.txt* which contains the basic options.
+
+.. include:: ../examples/differential_evolution_simple_config.txt
+   :literal:
+
+*differential_evolution_complete_config.txt* which contains a comprehensive list of options.
+
+.. include:: ../examples/differential_evolution_complete_config.txt
+   :literal:
+
 Nelder Mead
-~~~~~~~~~~~
+-----------

 The Nelder Mead controller implements the `Link Nelder-Mead method <https://en.wikipedia.org/wiki/Nelder%E2%80%93Mead_method>`_ for optimization. You can control the starting point and size of the initial simplex of the method with the configuration file.

@@ -69,7 +95,7 @@ There are two example files for the Nelder-Mead controller: *nelder_mead_simple_
    :literal:

 Random
-~~~~~~
+------

 The random optimization algorithm picks parameters randomly from a uniform distribution from within the parameter bounds or trust region.

@@ -84,15 +110,15 @@ There are two example files for the random controller: *random_simple_config.txt
    :literal:

 Logging
--------
+=======

 You can control the filename of the logs and also the level which is reported to the file and the console. For more information see `Link logging levels <https://docs.python.org/3.6/library/logging.html#levels>`_. The logging options are described in *logging_config.txt*.

 .. include:: ../examples/logging_config.txt
    :literal:

 Extras
-------
+======

 Extras refers to options related to post processing your data once the optimization is complete. Currently the only extra option is for visualizations. The extra options are described in *extras_config.txt*.
docs/index.rst

@@ -1,6 +1,6 @@
-######
+======
 M-LOOP
-######
+======

 The Machine-Learner Online Optimization Package is designed to automatically and rapidly optimize the parameters of a scientific experiment or computer controller system.

@@ -13,25 +13,24 @@ Using M-LOOP is simple, once the parameters of your experiment is computer contr
 M-LOOP not only finds an optimal set of parameters for the experiment it also provides a model of how the parameters are related to the costs which can be used to improve the experiment.

-If you use M-LOOP please cite our publication where we first used the package to optimise the production of a Bose-Einstein Condensate:
+If you use M-LOOP please cite our publication where we first used the package to optimize the production of a Bose-Einstein Condensate:

 Fast Machine-Learning Online Optimization of Ultra-Cold-Atom Experiments. *Scientific Reports* **6**, 25890 (2016). DOI: `Link 10.1038/srep25890 <http://dx.doi.org/10.1038/srep25890>`_

 http://www.nature.com/articles/srep25890

 Quick Start
------------
+===========

-To get the M-LOOP running as soon as possible follow the :ref:`sec-installation` instructions and :ref:`sec-tutorial`.
+To get M-LOOP running follow the :ref:`sec-installation` instructions and :ref:`sec-tutorial`.

 Contents
---------
+========

 .. toctree::
-   :maxdepth: 2

    install
-   tutorial
+   tutorials
    interfaces
    data
    visualizations

@@ -40,7 +39,7 @@ Contents
    api/index

 Indices
--------
+=======

 * :ref:`genindex`
 * :ref:`modindex`