
ML Prediction API using LDA and NN for EEG Dataset

 Built in Python with Flask for the APIs and deployed on Docker 

TL;DR {Build right away}

  • Download this repo
    • api.py serves the API routes of the Python-Flask application
    • train.py defines, trains, and saves both the NN and LDA classification models and the related model files
    • requirements.txt is the standard external-package dependency file for Python projects; it lists all the third-party packages the app uses
    • Dockerfile contains the build, directory, and environment info. It is used to build an image of the whole ML Prediction App that can be deployed as a container on any service.
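
The training step is not reproduced in this README, but a minimal sketch of what train.py might do is shown below. All names, shapes, and hyperparameters here are assumptions, and random stand-in data replaces train.csv:

```python
# Hypothetical sketch of train.py: fit an LDA classifier and a small
# neural network on EEG-shaped data, then persist both for api.py.
# All names, shapes, and hyperparameters are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier

rng = np.random.RandomState(0)
X = rng.rand(200, 160)                 # stand-in for train.csv: 160 features
y = rng.choice(list("AB"), size=200)   # the "Letter" classification target

lda = LinearDiscriminantAnalysis().fit(X, y)
nn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=200,
                   random_state=0).fit(X, y)

# In the real train.py the fitted models would be saved (e.g. with joblib)
# so api.py can load them at startup:
# joblib.dump(lda, "lda_model.pkl"); joblib.dump(nn, "nn_model.pkl")
print(lda.predict(X[:1])[0], nn.predict(X[:1])[0])
```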

Build Docker Image

docker build -t py-api -f Dockerfile .
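
For reference, a minimal Dockerfile matching this build command might look roughly like the following; the base image, port, and file layout are assumptions:

```dockerfile
# Minimal sketch of a Dockerfile for this app (all details assumed)
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python3", "api.py"]
```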

Run the built Docker image in a Docker container

docker run -it -p 5000:5000 py-api python3 api.py


SOME CONTEXT

Goals

  1. Build ML models that serve classifications and predictions through a Flask API in real time.
  2. Deploy the model and its Python package requirements in a Docker image.
  3. Test the image deployed in a container, on localhost and on the actual host, on port 5000.

Test Inputs

  1. test.json: 1300 rows of EEG data with 160 features (columns); used to test the models
  2. train.csv: Partial data to train the models

APIs with Desired Outcomes

  • Data Extraction

    Input: Row number
    Output: The extracted row, printed to the console
    Test Link: http://127.0.0.1:5000/line/{lineNumber}
  • Results from both models

    Input: Row number
    Process: Extract the selected row and inject it into the pre-trained, deployed models
    Output: Retrieve the classification prediction (the Letter variable in the data)
    Test Link: http://127.0.0.1:5000/prediction/{lineNumber}
  • Real-time Model Confidence Scores

    Input: None
    Process: Read all data from the local file {test.json}
    Output: Print the classification scores of both models.

    Test Link: http://127.0.0.1:5000/score
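
The three routes above could be sketched roughly as follows. This is a minimal illustration, not the actual api.py: the data rows and models are tiny in-memory stand-ins (the real app reads test.json and loads the pre-trained models), and the handler names are assumptions.

```python
# Hypothetical sketch of api.py's three routes (route paths taken from
# this README; everything else is a self-contained stand-in).
import numpy as np
from flask import Flask, jsonify

app = Flask(__name__)

DATA = np.random.RandomState(0).rand(5, 160)   # stand-in for test.json rows
LABELS = np.array(list("ABCDE"))               # stand-in "Letter" labels

class StubModel:
    """Stand-in for the pre-trained LDA / NN models."""
    def predict(self, X):
        return ["A"] * len(X)
    def score(self, X, y):
        return 0.5

lda_model = nn_model = StubModel()

@app.route("/line/<int:line_number>")
def extract_line(line_number):
    row = DATA[line_number].tolist()
    print(row)                                  # echoed to the console
    return jsonify(row)

@app.route("/prediction/<int:line_number>")
def predict_line(line_number):
    row = DATA[line_number].reshape(1, -1)
    return jsonify({"lda": lda_model.predict(row)[0],
                    "nn": nn_model.predict(row)[0]})

@app.route("/score")
def score():
    return jsonify({"lda": lda_model.score(DATA, LABELS),
                    "nn": nn_model.score(DATA, LABELS)})
```

With the container running, the routes can then be exercised from a browser or curl at the test links listed above.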


Nice to know!

  • I borrowed the idea from an article that outlined how a Python-Flask ML app can be built and deployed on Docker
  • Entirely coded, tested, and deployed in GitHub Codespaces
  • The ML output may have large errors; accuracy is not the focus here, since the goal is to learn:
    1. How to develop an ML model in Python
    2. How to make it accessible via Flask
    3. How to package it as a Docker image
    4. How to deploy the built image as a container
  • Since I did this in GitHub Codespaces, I had to test from the in-browser terminal; outside HTTP access was not available
  • In the PORTS tab [next to the TERMINAL tab in Codespaces], you can set a port to be exposed publicly, but server access is not guaranteed
  • Major packages used:
    1. scikit-learn
    2. Flask
    3. NumPy

TODO

  • Model improvement
  • Faster Docker image builds
  • Smaller image layers
  • Remove JSON-based data storage
  • Add DB access
  • Investigate the networking, security, monitoring, infrastructure, and orchestration needed by real PROD apps