Auptimizer


Auptimizer is an optimization tool for Machine Learning (ML) that automates many of the tedious parts of the model building process. Currently, Auptimizer helps with:

  • Automating tedious experimentation - Start using Auptimizer by changing just a few lines of your code. It will run and record sophisticated hyperparameter optimization (HPO) experiments for you, resulting in effortless consistency and reproducibility.

  • Making the best use of your compute-resources - Whether you are using a couple of GPUs or AWS, Auptimizer will help you orchestrate compute resources for faster hyperparameter tuning.

  • Getting the best models in minimum time - Generate optimal models and achieve better performance by employing state-of-the-art HPO techniques. Auptimizer provides a single, seamless access point to top-notch HPO algorithms, including Bayesian optimization and multi-armed bandit, and you can even integrate your own proprietary solutions.

Best of all, Auptimizer offers a consistent interface that allows users to switch between different HPO algorithms and computing resources with minimal changes to their existing code.
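For context, the kind of loop Auptimizer automates can be sketched in plain Python. This is an illustrative random-search HPO loop, not Auptimizer's actual API; the objective function and parameter ranges are hypothetical:

```python
import random

def train_and_score(lr, batch_size):
    # Stand-in for a real training run; returns a score to maximize.
    # (Hypothetical objective for illustration only.)
    return -(lr - 0.01) ** 2 - 0.001 * abs(batch_size - 64)

def random_search(n_trials, seed=0):
    """Minimal random-search HPO: sample, evaluate, keep the best."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        params = {
            "lr": rng.uniform(1e-4, 1e-1),
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        score = train_and_score(**params)
        if best is None or score > best[0]:
            best = (score, params)
    return best

best_score, best_params = random_search(50)
print(best_params)
```

Auptimizer's value is that this sampling, bookkeeping, and result recording happen for you, and the search strategy can be swapped (random, grid, Hyperband, ...) without rewriting the loop.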

In the future, Auptimizer will support end-to-end model building for edge devices, including model compression and neural architecture search. The table below shows a full list of currently supported techniques.

| Supported HPO Algorithms | Supported Infrastructure |
| --- | --- |
| Random | Multiple CPUs |
| Grid | Multiple GPUs |
| Hyperband | Multiple Machines (SSH) |
| Hyperopt | AWS EC2 instances |
| Spearmint | |
| EAS (experimental) | |
| Passive | |

Install

Auptimizer is currently well tested on Linux systems; it may require some tweaks for Windows users.

```shell
pip install auptimizer
```

Note: dependencies are not included. Running `pip install -r requirements.txt` will install the libraries needed for all functionalities.

Documentation

See more in the documentation.

Example

```shell
cd Examples/demo
# Setup environment (interactively create the environment file based on user input)
python -m aup.setup
# Setup experiment
python -m aup.init
# Create training script - auto.py
python -m aup.convert origin.py experiment.json demo_func
# Run aup for this experiment
python -m aup experiment.json
```
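The `experiment.json` file describes the experiment to run. The sketch below shows its general shape; the field names and values here are illustrative assumptions based on the workflow above, not a verified schema, so consult the documentation for the exact format:

```json
{
  "proposer": "random",
  "script": "auto.py",
  "n_samples": 10,
  "parameter_config": [
    {"name": "x", "range": [-5, 5], "type": "float"}
  ]
}
```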

Each job's hyperparameter configuration is saved separately under `jobs/*.json` and is also recorded in the SQLite file `.aup/sqlite3.db`.
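Because the results land in a standard SQLite file, they can be inspected with Python's built-in `sqlite3` module. A minimal sketch (the internal table names are aup implementation details, so this simply lists whatever tables the file contains):

```python
import sqlite3
from contextlib import closing

def list_tables(db_path):
    """Return the table names stored in an SQLite database file."""
    with closing(sqlite3.connect(db_path)) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]

# e.g. list_tables(".aup/sqlite3.db") after running an experiment
```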


More examples are under Examples.

License

GPL 3.0 License

Cite

If you have used this software for research, please cite the following paper (accepted at IEEE Big Data 2019):

@misc{liu2019auptimizer,
    title={Auptimizer -- an Extensible, Open-Source Framework for Hyperparameter Tuning},
    author={Jiayi Liu and Samarth Tripathi and Unmesh Kurup and Mohak Shah},
    year={2019},
    eprint={1911.02522},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}