ML Microservice in Python

A simple uWSGI-with-Flask web server implementing a Machine Learning REST API.

Introduction

Python has some exceptional ML libraries. To make them easier to consume, this project provides a simple REST API that should work for most ML models, plus a Docker container wrapping the whole thing.

Usage

Edit app/config.yml to add your AWS credentials.

aws:
  BUCKET : '<bucket name>'
  AWS_ACCESS_KEY_ID : '<ACCESS_KEY_ID here>'
  AWS_SECRET_ACCESS_KEY : '<ACCESS_SECRET_KEY here>'
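A minimal sketch of how the app might read this config at startup, assuming PyYAML is installed; the `load_aws_config` helper name and the example values are illustrative, not taken from the repo:

```python
# Hypothetical loader for app/config.yml; key names mirror the config above.
import yaml

def load_aws_config(text):
    """Parse the YAML config and return the 'aws' section as a dict."""
    return yaml.safe_load(text)["aws"]

# Inline example standing in for the contents of app/config.yml.
EXAMPLE = """
aws:
  BUCKET: 'my-bucket'
  AWS_ACCESS_KEY_ID: 'AKIA-example'
  AWS_SECRET_ACCESS_KEY: 'example-secret'
"""

cfg = load_aws_config(EXAMPLE)
print(cfg["BUCKET"])  # → my-bucket
```

In the real app these values would be passed to the S3 client (e.g. boto3) that fetches the files referenced by the API below.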

To handle each file passed in a POST request, edit app/ML.py:

  def handleTrainFile(contents):
      # Dear ML engineer, feel free to handle the training
      # of your model here.
      print('Received /train file, contents...')
      print(contents)
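As a hedged illustration of what a filled-in handler could look like, the sketch below parses the file contents as CSV and "trains" by averaging the last column; the CSV layout and the per-column-mean "model" are stand-ins for a real training step, not the repo's actual logic:

```python
# Illustrative handleTrainFile: the real implementation would fit an
# actual model; here a per-row mean of the last CSV column stands in.
import csv
import io

def handleTrainFile(contents):
    """Parse CSV contents and 'train' by averaging the last column."""
    reader = csv.reader(io.StringIO(contents))
    values = [float(row[-1]) for row in reader if row]
    model = sum(values) / len(values)  # placeholder for real model fitting
    print('Received /train file, contents...')
    print(contents)
    return model

print(handleTrainFile("1,2\n3,4\n"))  # → 3.0
```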

REST API

POST /train

Body:

{
  "files": [ "<S3 file path>", ... ]
}

Example:

{
  "files": [ "USERS/data/1234.csv" ]
}

Response:

{
  "jobId": "1"
}

Error: 500
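A client could build the /train request body shown above like this; the `train_body` helper name is illustrative, and the actual HTTP call (e.g. via `requests.post`) is omitted:

```python
# Sketch of a client-side helper that serializes S3 file paths
# into the JSON body expected by POST /train.
import json

def train_body(s3_paths):
    """Serialize a list of S3 file paths into the /train JSON body."""
    return json.dumps({"files": list(s3_paths)})

body = train_body(["USERS/data/1234.csv"])
print(body)  # → {"files": ["USERS/data/1234.csv"]}
```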

POST /input

Body:

{
  "files": [ "<S3 file path>", ... ]
}

Response:

{
  "jobId": "1"
}

Error: 500

POST /output/

Body:

{
  "files": [ "<S3 file path>", ... ]
}

Response:

{
  "output": <output vector>
}

Error: 500

GET /status?taskID=


Response:

{ "status": "<completed|running|cancelled>" }


Error: 500
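A caller might poll GET /status until a terminal state is reached. In the sketch below, `fetch_status` stands in for the real HTTP call (e.g. `requests.get` against /status), and the states match those listed above; the helper name and timeout logic are illustrative:

```python
# Sketch of polling GET /status until the job finishes or is cancelled.
import time

def wait_for_job(fetch_status, job_id, interval=0.0, max_polls=100):
    """Poll until the job reports 'completed' or 'cancelled'."""
    for _ in range(max_polls):
        status = fetch_status(job_id)["status"]
        if status in ("completed", "cancelled"):
            return status
        time.sleep(interval)
    raise TimeoutError("job %s still running" % job_id)

# Fake fetcher standing in for an HTTP call: 'running' twice, then 'completed'.
responses = iter(["running", "running", "completed"])
print(wait_for_job(lambda jid: {"status": next(responses)}, "1"))  # → completed
```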
