Optimization Examples with SigOpt
Getting Started with SigOpt

Welcome to the SigOpt Examples. These examples show you how to use SigOpt for model tuning tasks in various machine learning environments.

Requirements

Most of these examples will run on any Linux or Mac OS X machine from the command line. Each example contains a README.md with specific setup instructions.

First Time?

If this is your first time using SigOpt, we recommend you work through the Random Forest example. In this example, you will use a random forest to classify data from the iris dataset and use SigOpt to maximize the k-fold cross-validation accuracy by tuning the model's hyperparameters. This example is available in a wide variety of languages and integrations.
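The Random Forest example follows SigOpt's suggest/evaluate/report loop. The sketch below shows that loop's shape using the `sigopt` Python client (`Connection`, `suggestions().create()`, `observations().create()`, and `best_assignments().fetch()`); note that the objective here is a toy stand-in rather than the real k-fold cross-validation accuracy, and `SIGOPT_API_TOKEN` is a hypothetical environment variable name for your API token.

```python
import os

def evaluate_model(assignments):
    """Stand-in objective: in the real example, this would return the
    k-fold cross-validation accuracy of a RandomForestClassifier built
    from `assignments` on the iris dataset."""
    # Toy smooth function of two illustrative hyperparameters.
    max_features = assignments["max_features"]
    n_estimators = assignments["n_estimators"]
    return -((max_features - 2) ** 2) - ((n_estimators - 50) / 100.0) ** 2

def run_optimization(conn, budget=20):
    # Create an experiment describing the tunable hyperparameters.
    experiment = conn.experiments().create(
        name="Random Forest (iris)",
        parameters=[
            dict(name="max_features", type="int", bounds=dict(min=1, max=4)),
            dict(name="n_estimators", type="int", bounds=dict(min=10, max=100)),
        ],
        observation_budget=budget,
    )
    # The optimization loop: ask for a suggestion, evaluate it, report back.
    for _ in range(budget):
        suggestion = conn.experiments(experiment.id).suggestions().create()
        value = evaluate_model(suggestion.assignments)
        conn.experiments(experiment.id).observations().create(
            suggestion=suggestion.id,
            value=value,
        )
    return conn.experiments(experiment.id).best_assignments().fetch()

if __name__ == "__main__":
    from sigopt import Connection  # pip install sigopt
    conn = Connection(client_token=os.environ["SIGOPT_API_TOKEN"])
    print(run_optimization(conn))
```

The objective function only returns a number to SigOpt; the training data and model never leave your machine, which is the integration pattern all of these examples share.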

More Examples

The other directories in this repository cover a range of use cases, from tuning deep neural networks (tensorflow-cnn, caffe2-cnn, dnn-tuning-nvidia-mxnet) to multimetric optimization (multimetric-demo, multimetric-timeseries), reinforcement learning, and XGBoost classification. Each directory's README.md describes the example and how to run it.

Questions?

Any questions? Drop us a line at support@sigopt.com.

API Reference

To implement SigOpt for your use case, feel free to use or extend the code in this repository. Our core API can bolt on top of any complex model or process and guide it to its optimal configuration in as few iterations as possible.

About SigOpt

With SigOpt, data scientists and machine learning engineers can build better models with less trial and error.

Machine learning models depend on hyperparameters that trade off bias and variance, among other key outcomes. SigOpt provides Bayesian hyperparameter optimization using an ensemble of the latest research.

SigOpt can tune any machine learning model, including popular techniques like gradient boosting, deep neural networks, and support vector machines. SigOpt’s REST API and client libraries (Python, R, Java) integrate into any existing ML workflow.

SigOpt augments your existing model training pipeline, suggesting parameter configurations to maximize any online or offline objective, such as AUC ROC, model accuracy, or revenue. You only send SigOpt your metadata, not the underlying training data or model.

SigOpt is available through Starter, Workgroup, and Enterprise plans, and is free forever for academic users.
