Cruise 🚀

Automatic deployment of TensorFlow models as REST APIs to Heroku using TensorFlow Serving.

Try it out.

Deploy

How to use:

  1. Save your TensorFlow model using the TensorFlow SavedModel API (a sketch of steps 1-3 follows below)

  2. Create a model configuration file in the model base path

  3. Compress the model as a tar.gz archive and upload it to a public AWS S3 bucket

  4. Click the deploy button on this repo to deploy the model

Visit the documentation for a detailed guide.
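
For reference, here is a minimal sketch of steps 1-3 (not Cruise's official packaging script). It assumes a Keras model and the layout used by the example bucket in the next section: a model folder named img_classifier containing a numeric version directory and the configuration file. Check the documentation for the exact configuration filename and base path Cruise expects.

import tarfile

import tensorflow as tf

# hypothetical stand-in for your trained model; substitute your own
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# 1. Export the model in the SavedModel format. TensorFlow Serving expects a
#    numeric version subdirectory inside the model folder (img_classifier/1/).
tf.saved_model.save(model, "img_classifier/1")

# 2. Write a model configuration file into the model base path. The filename
#    and base_path below are assumptions; use the values from the documentation.
config_text = """model_config_list {
  config {
    name: "img_classifier"
    base_path: "/models/img_classifier"
    model_platform: "tensorflow"
  }
}
"""
with open("img_classifier/models.config", "w") as f:
    f.write(config_text)

# 3. Package the model folder as a tar.gz archive and upload it to a public
#    AWS S3 bucket.
with tarfile.open("tf-models.tar.gz", "w:gz") as tar:
    tar.add("img_classifier")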

How to test it:

After clicking the deploy button, you can use this AWS bucket URL for deployment:

https://cruise-bucket.s3.amazonaws.com/tf-models.tar.gz

Set MODEL_FOLDER_NAME to img_classifier, then deploy to your Heroku account.

On successful deployment, navigate to:

https://YOUR-APP-NAME.herokuapp.com/v1/models/YOUR-MODEL-NAME

You should see something similar to:

{
 "model_version_status": [
  {
   "version": "1",
   "state": "AVAILABLE",
   "status": {
    "error_code": "OK",
    "error_message": ""
   }
  }
 ]
}
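
You can also query the status endpoint from Python instead of the browser. A small sketch; the placeholders are your own Heroku app and model names:

import requests

# hypothetical placeholders; replace with your Heroku app and model names
url = "https://YOUR-APP-NAME.herokuapp.com/v1/models/YOUR-MODEL-NAME"
print(requests.get(url).json())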

If you used the URL above, here's a simple Python script you can run locally to test the model:

import requests
import json
import numpy as np
from tensorflow.keras.datasets.mnist import load_data


# load the MNIST test set
(_, _), (x_test, y_test) = load_data()

# reshape the data to have a single channel
x_test = x_test.reshape((x_test.shape[0], x_test.shape[1], x_test.shape[2], 1))

# normalize pixel values to the [0, 1] range
x_test = x_test.astype('float32') / 255.0


test_img = x_test[0]

YOUR_APP_NAME = "the-name-of-your-heroku-app"
url = f'https://{YOUR_APP_NAME}.herokuapp.com/v1/models/img_classifier:predict'


def make_prediction(instances, many=False):
    # wrap a single image in a list so the request body is always a batch
    if not many:
        data = json.dumps({"signature_name": "serving_default", "instances": [instances.tolist()]})
    else:
        data = json.dumps({"signature_name": "serving_default", "instances": instances.tolist()})
    headers = {"content-type": "application/json"}
    json_response = requests.post(url, data=data, headers=headers)
    predictions = json.loads(json_response.text)['predictions']
    return predictions


# print the predicted digit for the test image
for p in make_prediction(test_img):
    print(np.argmax(p))
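
The script sends the first MNIST test image to the model's :predict endpoint and prints the class with the highest predicted score. The true label of that image is 7, so a well-trained classifier should print 7.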

Features:

The end goal is to let you effortlessly deploy your TensorFlow models to Heroku and test them with ease.

  • allow users to deploy multiple versions of their models to Heroku with the click of a button

  • allow users to easily extend their server to serve multiple TensorFlow models (see the configuration sketch below)
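
Both of these map onto TensorFlow Serving's model configuration file. As a rough illustration (the second model name and the paths are hypothetical, and the exact file Cruise reads is described in the documentation), a config that serves two models and pins two versions of the first could look like this:

model_config_list {
  config {
    name: "img_classifier"
    base_path: "/models/img_classifier"
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1
        versions: 2
      }
    }
  }
  config {
    name: "text_classifier"
    base_path: "/models/text_classifier"
    model_platform: "tensorflow"
  }
}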

References: