Open source platform for the machine learning lifecycle


MLflow Beta Release

Note: The current version of MLflow is a beta release. This means that APIs and data formats are subject to change!

Note 2: We do not currently support running MLflow on Windows. Despite this, we would appreciate any contributions to make MLflow work better on Windows.


Install MLflow from PyPI via pip install mlflow

MLflow requires conda to be on the PATH for the projects feature.

Nightly snapshots of MLflow master are also available here.


Official documentation for MLflow can be found at


To discuss MLflow or get help, please subscribe to our mailing list or join us on Slack.

To report bugs, please use GitHub issues.

Running a Sample App With the Tracking API

The programs in the examples directory use the MLflow Tracking API. For instance, run:

python examples/quickstart/

This program uses the MLflow Tracking API, which logs tracking data in ./mlruns. The logged data can then be viewed with the Tracking UI.

Launching the Tracking UI

The MLflow Tracking UI will show runs logged in ./mlruns at http://localhost:5000. Start it with:

mlflow ui

Note: Running mlflow ui from within a clone of MLflow is not recommended, since doing so will run the dev UI from source. We recommend running the UI from a different working directory, using the --file-store option to specify which log directory to run against. Alternatively, see instructions for running the dev UI in the contributor guide.

Running a Project from a URI

The mlflow run command lets you run a project packaged with an MLproject file from a local path or a Git URI:

mlflow run examples/sklearn_elasticnet_wine -P alpha=0.4

mlflow run -P alpha=0.4

See examples/sklearn_elasticnet_wine for a sample project with an MLproject file.
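An MLproject file is a small YAML file at the project root that names the project, points at a conda environment, and declares entry points with their parameters. A minimal sketch (the entry-point script name, conda file name, and parameter defaults below are illustrative, not copied from the example project):

```yaml
name: sklearn_elasticnet_wine

conda_env: conda.yaml

entry_points:
  main:
    parameters:
      alpha: {type: float, default: 0.5}
    command: "python train.py {alpha}"
```

With a file like this in place, mlflow run resolves -P alpha=0.4 against the declared parameters and substitutes the value into the entry point's command.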

Saving and Serving Models

To illustrate managing models, the mlflow.sklearn package can log scikit-learn models as MLflow artifacts and then load them again for serving. There is an example training application in examples/sklearn_logisitic_regression/ that you can run as follows:

$ python examples/sklearn_logisitic_regression/
Score: 0.666
Model saved in run <run-id>

$ mlflow sklearn serve -r <run-id> -m model

$ curl -d '[{"x": 1}, {"x": -1}]' -H 'Content-Type: application/json' -X POST localhost:5000/invocations


We happily welcome contributions to MLflow. Please see our contribution guide for details.