Fast, easy machine learning deployment & collaboration for humans


omega|ml is the fastest way to deploy machine learning models

omega|ml takes just a single line of code to

  • deploy machine learning models straight from Jupyter Notebook (or any other code)
  • implement data pipelines quickly, without memory limitation, all from a Pandas-like API
  • serve models and data from an easy to use REST API

Further, omega|ml is the fastest way to

  • scale model training on the included pure-Python compute cluster, on Spark, or on any other cloud
  • collaborate on data science projects easily, sharing Jupyter Notebooks
  • deploy beautiful dashboards right from your Jupyter Notebook, using dashserve



Get started in < 5 minutes

Start the omega|ml server right from your laptop or virtual machine

$ wget
$ docker-compose up -d

Jupyter Notebook is immediately available at http://localhost:8899 (omegamlisfun to log in). Any notebook you create is automatically stored in the integrated omega|ml database, making collaboration a breeze. The REST API is available at http://localhost:5000.

Already have a Python environment (e.g. Jupyter Notebook)? Leverage the power of omega|ml by installing as follows:

# assuming you have started the server as per above
$ pip install omegaml


Get more information at

import omegaml as om
import requests
from sklearn.linear_model import LogisticRegression

# transparently store Pandas Series and DataFrames or any Python object
om.datasets.put(df, 'stats')
om.datasets.get('stats', sales__gte=100)

# transparently store and get models
clf = LogisticRegression()
om.models.put(clf, 'forecast')
clf = om.models.get('forecast')

# run and scale models directly on the integrated Python or Spark compute cluster
om.runtime.model('forecast').fit('stats[^sales]', 'stats[sales]')
om.runtime.model('forecast').gridsearch(X, Y)

# use the REST API to store and retrieve data, run predictions
# (served at http://localhost:5000, as per above)
requests.put('/v1/dataset/stats', json={...})
requests.put('/v1/model/forecast', json={...})
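The sales__gte=100 filter above follows the Django-style double-underscore convention for query operators. As an illustration of how such keyword filters map onto MongoDB-style query documents (a hypothetical helper sketched here for clarity, not omega|ml's actual implementation):

```python
# Hypothetical sketch: translate Django-style filter kwargs
# (e.g. sales__gte=100) into a MongoDB-style query document.
# Illustrative only; not omega|ml's actual code.

OPERATORS = {'gte': '$gte', 'gt': '$gt', 'lte': '$lte',
             'lt': '$lt', 'ne': '$ne', 'in': '$in'}

def to_mongo_query(**filters):
    query = {}
    for key, value in filters.items():
        if '__' in key:
            field, op = key.rsplit('__', 1)
            query.setdefault(field, {})[OPERATORS[op]] = value
        else:
            query[key] = value  # plain key means equality match
    return query

print(to_mongo_query(sales__gte=100))
# {'sales': {'$gte': 100}}
```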

Use Cases

omega|ml currently supports scikit-learn out of the box. Need to deploy a model from another framework? Open an issue at or drop us a line at

Machine Learning Deployment

  • deploy models to production with a single line of code
  • serve and use models or datasets from a REST API

Data Science Collaboration

  • get a fully integrated data science workplace within minutes [1]
  • easily share models, data, Jupyter notebooks and reports with your collaborators

Centralized Data & Compute cluster

  • perform out-of-core computations on a pure-Python or Apache Spark compute cluster [2]
  • have a shared NoSQL database, out of the box, that behaves like a Pandas dataframe [3]
  • use a compute cluster to train your models with no additional setup
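To illustrate the out-of-core idea behind [2] and [3] (a minimal pure-Python sketch for intuition only; omega|ml delegates this work to MongoDB's aggregation framework and the compute cluster), a dataset larger than memory can be reduced chunk by chunk, so only one chunk is ever held in memory:

```python
# Minimal sketch of an out-of-core aggregation: compute the mean of a
# stream too large for memory by processing it in fixed-size chunks.
# Illustrative only, not omega|ml's implementation.

def chunked(iterable, size):
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

def streaming_mean(stream, chunk_size=1000):
    total, count = 0.0, 0
    for chunk in chunked(stream, chunk_size):
        total += sum(chunk)   # only one chunk resides in memory at a time
        count += len(chunk)
    return total / count

print(streaming_mean(range(1_000_000)))  # 499999.5
```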

Scalability and Extensibility

  • scale your data science work from your laptop to team to production with no code changes
  • integrate any machine learning framework or third party data science platform with a common API

Towards Data Science recently published an article on omega|ml:

[1] supporting scikit-learn, Spark MLLib out of the box, Keras and Tensorflow available shortly. Note the Spark integration is currently only available with the enterprise edition.
[2] using Celery, Dask Distributed or Spark
[3] leveraging MongoDB's excellent aggregation framework

In addition, omega|ml provides an easy-to-use extensions API to support any kind of model, compute cluster, database and data source.
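In spirit, such an extensions API amounts to a plugin registry of backends behind a common interface. The following is a hypothetical sketch for illustration only; see the omega|ml documentation for the actual backend interface:

```python
# Hypothetical sketch of a plugin-style backend registry, illustrating
# the general idea of an extensions API. The actual omega|ml interface
# is documented separately.

class ModelBackend:
    """Base interface a model backend would implement."""
    def put(self, model, name):
        raise NotImplementedError
    def get(self, name):
        raise NotImplementedError

BACKENDS = {}

def register_backend(kind, backend_cls):
    BACKENDS[kind] = backend_cls()

class InMemoryBackend(ModelBackend):
    """Trivial backend storing models in a dict."""
    def __init__(self):
        self.store = {}
    def put(self, model, name):
        self.store[name] = model
    def get(self, name):
        return self.store[name]

register_backend('memory', InMemoryBackend)
backend = BACKENDS['memory']
backend.put({'coef': [1, 2]}, 'forecast')
print(backend.get('forecast'))  # {'coef': [1, 2]}
```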

Enterprise Edition

omega|ml Enterprise Edition provides security on every level and is ready-made for Kubernetes deployment. It is licensed separately for on-premise, private or hybrid cloud. Sign up at
