Color Theory App

Description

A toy project/demo showing how to structure and develop a microservice-driven application powered by a machine learning service.

Motivation

Between 2019-04 and 2019-06, only 5 of the more than 630 articles published on towardsdatascience.com dealt with ML service deployment. These figures reflect the situation in the data science community: only a small fraction of engineers/data scientists are able to deliver an ML product into production.

App Idea

Build a web app to answer the question "What's the color?", assuming a color is a point in RGB or HEX space with a name and a type.

App Classical Service Features

Display the color name and the RGB and HEX codes for the user's input color.
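This feature amounts to converting between the two color encodings. A minimal sketch of the conversion (the function names are illustrative, not the app's actual code):

```python
def hex_to_rgb(hexcode: str) -> tuple:
    """Convert a HEX color code, e.g. '5C77FA', to an (r, g, b) tuple."""
    hexcode = hexcode.lstrip('#')
    return tuple(int(hexcode[i:i + 2], 16) for i in (0, 2, 4))


def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Convert an (r, g, b) triple to an upper-case HEX code."""
    return f'{r:02X}{g:02X}{b:02X}'


print(hex_to_rgb('5C77FA'))      # (92, 119, 250)
print(rgb_to_hex(92, 119, 250))  # '5C77FA'
```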

ML Service Feature

The service's objective is to assign a binary category, warm or cool, to the color selected by the user. The two categories are described as follows:

Color theory has described perceptual and psychological effects to this contrast. Warm colors are said to advance or appear more active in a painting, while cool colors tend to recede; used in interior design or fashion, warm colors are said to arouse or stimulate the viewer, while cool colors calm and relax.

App Structure

color_theory_app
    ├── server
    ├── client
    ├── docker-compose.yaml
    └── launch_services.sh

The app has two service sides, frontend and backend:

  • the client/frontend side can be generalised as the product, maintained and developed by developers + DevOps
  • the server/backend side can be generalised as the micro-service, maintained and developed by data scientists/engineers + machine learning engineers + developers + Dev-/DataOps

The frontend/product communicates with the backend (e.g. the machine-learning service) to provide users with unique features, or to improve the user experience.

Backend

The backend exposes interface(s) to communicate with other services (the frontend service in our case) via a set of end-points.

server
    ├── common_libs
    ├── color_name
    └── color_type

The app's server side has two micro-services with HTTP interfaces (API end-points):

  • color_name - provides color name
  • color_type - provides color binary class/type

Both services' APIs share a similar code structure:

service
  └── api
       ├── sub-modules
       │    ├── __init__.py
       │    ├── ...
       │    └── module.py
       ├── Dockerfile
       ├── requirements.txt
       └── run_server.py

run_server.py is the service runner/executable.

Base Service - Color Name

The service takes an input color as a HEX or RGB code and returns the color name. The name is identified as that of the nearest color in a reference in-memory data set, measured by Euclidean distance between the input color and the reference colors:

  1. The array of Euclidean distances between the input color and the reference colors is calculated in RGB space
  2. The minimum distance is identified
  3. The name of the corresponding reference color is assigned to the input color
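The three steps above can be sketched in a few lines of numpy; the reference colors below are a made-up stand-in for the service's in-memory data set:

```python
import numpy as np

# Hypothetical reference data set; the real service ships its own colors.
REF_NAMES = ['Blueberry', 'Tomato', 'Mint']
REF_RGB = np.array([[92, 119, 250],
                    [255, 99, 71],
                    [62, 180, 137]], dtype=float)


def color_name(r: int, g: int, b: int) -> str:
    # 1. Euclidean distances between the input color and references in RGB space
    distances = np.linalg.norm(REF_RGB - np.array([r, g, b], dtype=float), axis=1)
    # 2. + 3. arg-min picks the nearest reference; its name is assigned
    return REF_NAMES[int(np.argmin(distances))]


print(color_name(90, 120, 248))  # 'Blueberry'
```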

The service delivers the product feature F2 (see infrastructure).

ML Service - Color Type

color_type
  ├── Dockerfile
  ├── api
  └── ml
      ├── model
      └── train

The machine learning service consumes the model from ./color_type/ml/model/ and predicts a color class/type based on the input color code:

model.predict(pandas.DataFrame({'r': [r_in],
                                'g': [g_in],
                                'b': [b_in]}))

The service delivers the product feature F3 (see infrastructure).
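.sav files are commonly pickled scikit-learn estimators, so the service would typically restore the model with something like joblib.load('ml/model/v2/model_v2.sav'). A sketch of how the prediction call above could be wrapped, with a hypothetical stub standing in for the real model:

```python
import pandas as pd


# Stand-in for the estimator pickled in ml/model/; the decision rule here is
# invented purely for illustration.
class WarmCoolStub:
    """Hypothetical rule: more red than blue -> warm (1), otherwise cool (0)."""

    def predict(self, X: pd.DataFrame):
        return (X['r'] > X['b']).astype(int).to_numpy()


model = WarmCoolStub()


def predict_color_type(r_in: int, g_in: int, b_in: int) -> int:
    # Same call shape as the service snippet above
    return int(model.predict(pd.DataFrame({'r': [r_in],
                                           'g': [g_in],
                                           'b': [b_in]}))[0])


print(predict_color_type(92, 119, 250))  # 0 (cool)
print(predict_color_type(250, 100, 70))  # 1 (warm)
```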

Machine Learning Model Development

ml
├── model
│   ├── v1
│   │   └── model_v1.sav
│   └── v2
│       └── model_v2.sav
└── train
    ├── data
    │   └── warm_cold_colors.csv
    └── ml_steps.ipynb

The models can be iteratively developed by the data science team according to the flow:

consume data from data dir -> train and evaluate the model -> model export into model dir -> model quality monitoring -> model re-train
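One iteration of this flow can be sketched with scikit-learn and joblib. Assumptions: warm_cold_colors.csv has columns r, g, b, is_warm (here a synthetic stand-in is generated instead of reading the file), and the model is exported to a temporary path rather than the real model dir:

```python
import os
import tempfile

import joblib
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for train/data/warm_cold_colors.csv (assumed columns:
# r, g, b, is_warm); the labeling rule below is invented for illustration.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(0, 256, size=(500, 3)), columns=['r', 'g', 'b'])
df['is_warm'] = (df['r'] > df['b']).astype(int)

# train and evaluate the model
X_train, X_test, y_train, y_test = train_test_split(
    df[['r', 'g', 'b']], df['is_warm'], test_size=0.2, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = model.score(X_test, y_test)

# model export into the model dir (here: a temporary path)
path = os.path.join(tempfile.mkdtemp(), 'model_v2.sav')
joblib.dump(model, path)
print(f'hold-out accuracy={accuracy:.2f}, exported to {path}')
```

Model quality monitoring and re-training close the loop: once the hold-out metric degrades on fresh data, the next model version is trained and exported alongside the previous one (v1, v2, ...).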

API: Endpoints Contract

Base Service / Color Name

  Endpoints:

    /name/hex?hexcode=5C77FA
    /name/rgb?r=92&g=119&b=250

  Response 200:

    {
      "data": {
        "color": {
          "r": 92,
          "g": 119,
          "b": 250
        },
        "name": "Blueberry"
      }
    }

  Response 500:

    { "data": null }

ML Service / Color Type

  Endpoints:

    /type/hex?hexcode=5C77FA
    /type/rgb?r=92&g=119&b=250

  Response 200:

    {
      "data": {
        "color": {
          "r": 92,
          "g": 119,
          "b": 250
        },
        "is_warm": 1
      }
    }

Frontend/Client Side

The app's frontend service provides a web interface for a user to select a color of interest and determine its properties by communicating with the backend via its API end-point(s). It also delivers feature F1 (see infrastructure).
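For illustration, this is how a client might parse the backend's documented responses. The payload is copied from the endpoints contract; the function name is illustrative, not the app's actual code:

```python
import json

# Example 200 response body for /name/rgb?r=92&g=119&b=250, per the contract
payload = '{"data": {"color": {"r": 92, "g": 119, "b": 250}, "name": "Blueberry"}}'


def parse_name_response(raw: str) -> tuple:
    body = json.loads(raw)
    if body['data'] is None:  # the 500 contract returns {"data": null}
        raise RuntimeError('color_name service error')
    color = body['data']['color']
    return body['data']['name'], (color['r'], color['g'], color['b'])


print(parse_name_response(payload))  # ('Blueberry', (92, 119, 250))
```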

App UI + Infrastructure

[Figure app_infra: app UI and infrastructure diagram]

App Run

Requirements

docker ver. >= 18.09.2
docker-compose ver. >= 1.23.2

Installation

Docker: follow the official Docker installation guide for your platform.

App Launch

Docker-compose:

To launch the app, clone the repo

git clone git@github.com:kislerdm/color_theory_app.git && cd color_theory_app

and build and run the Docker images with the services and the client app

sh launch_services.sh

Once the Docker images are built and the containers are up and running, you can access the UI at http://localhost:10000 in your browser.