Streamlined machine learning experiment management.

randopt is a Python package for machine learning experiment management, hyper-parameter optimization, and results visualization. Some of its features include:

  • result logging and management,
  • human-readable format,
  • support for parallelism / distributed / asynchronous experiments,
  • command-line and programmatic API,
  • shareable, flexible Web visualization,
  • automatic hyper-parameter search, and
  • pure Python - no dependencies!

Installation

pip install randopt

Usage

import randopt as ro

def loss(x):
    return x**2

e = ro.Experiment('myexp', {
        'alpha': ro.Gaussian(mean=0.0, std=1.0, dtype='float'),
    })

# Sampling parameters
for i in range(100):
    e.sample('alpha')
    res = loss(e.alpha)
    print('Result: ', res)
    e.add_result(res)

# Manually setting parameters
e.alpha = 0.00001
res = loss(e.alpha)
e.add_result(res)

# Search over all experiments results, including ones from previous runs
opt = e.minimum()
print('Best result: ', opt.result, ' with params: ', opt.params)

Results Visualization

Once you have obtained some results, run roviz.py path/to/experiment/folder to visualize them in your web browser.

For more info on visualization and roviz.py, refer to the Visualizing Results tutorial.

Hyper-Parameter Optimization

To generate results and search for good hyper-parameters, you can either use ropt.py or write your own optimization script using the Evolutionary and GridSearch classes.

For more info on hyper-parameter optimization, refer to the Optimizing Hyperparams tutorial.
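To illustrate the kind of loop that ropt.py automates, here is a minimal hand-rolled random search in plain Python. It does not use randopt's API; the loss function and search range are hypothetical stand-ins.

```python
import random

def loss(x):
    # Hypothetical objective to minimize: f(x) = x^2.
    return x ** 2

def random_search(n_trials=100, low=-3.0, high=3.0, seed=0):
    # Sample candidates uniformly from [low, high] and keep the
    # one with the lowest loss, mirroring a basic random search.
    rng = random.Random(seed)
    best_x, best_loss = None, float('inf')
    for _ in range(n_trials):
        x = rng.uniform(low, high)
        current = loss(x)
        if current < best_loss:
            best_x, best_loss = x, current
    return best_x, best_loss

best_x, best_loss = random_search()
print('Best params:', best_x, 'loss:', best_loss)
```

Tools like ropt.py layer result logging and parallel execution on top of this basic pattern, so concurrent workers can write to the same experiment folder.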

Documentation

For more examples, tutorials, and documentation refer to the wiki.

Contributing

To contribute to Randopt, please follow the contribution guidelines.

Acknowledgements

Randopt is maintained by Séb Arnold, with numerous contributions from the following people.

  • Noel Trivedi
  • Cyrus Jia
  • Daler Asrorov

License

Randopt is released under the Apache 2 License. For more information, refer to the LICENSE file.

I would love to hear how you use Randopt. Feel free to drop me a line!