
snapmath

Project TOC

  1. Just want to run the project;
  2. The app in action;
  3. Design Document;
  4. Jupyter Notebook of model;
  5. App local development (Python);
  6. App local development (React);
  7. Saving a built and trained model;
  8. Exposing the model on TensorFlow Serving;
  9. Next steps;
  10. Contribution;

Running from Docker

docker run -p 5000:5000 -e MODEL_PATH="./models/1703825980" buarki/snapmath-app

Once running, go to http://localhost:5000. You can find some images to use in the ML directory.

App in action

(Screenshots of the app in action.)

Running Python app from source code

To develop the app, follow these steps:

  1. Enter the app directory:
cd app
  2. Initialize the virtual env:
python3 -m venv snapmath
  3. Activate the virtual env:
source snapmath/bin/activate
  4. Install dependencies:
pip3 install -r requirements.txt
  5. Run the app:
MODEL_PATH="../models/1703825980" python3 -m flask run --host=0.0.0.0

MODEL_PATH points to the saved model 1703825980.

Once running, go to http://localhost:5000. You can find some images to use in the ML directory.
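Since the app is configured entirely through the MODEL_PATH environment variable, a minimal sketch of how such a lookup can be done is shown below (the helper name is hypothetical, not part of the repository):

```python
import os

def get_model_path(env=None):
    """Return the saved-model directory from MODEL_PATH, failing loudly if unset."""
    env = os.environ if env is None else env
    path = env.get("MODEL_PATH")
    if not path:
        raise RuntimeError("MODEL_PATH environment variable is required")
    return path

# With MODEL_PATH="../models/1703825980" the app would load that directory.
```

Failing at startup when the variable is missing makes misconfiguration obvious, rather than surfacing later as a confusing model-loading error.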

Running React app from source code

To develop the app, follow these steps:

  1. Enter the app directory:
cd app-js
  2. Set the Node version to 19.0.0:
nvm use 19.0.0
  3. Install dependencies:
npm i
  4. Run:
npm run dev

Once running, go to http://localhost:3000. You can find some images to use in the ML directory.

For more info, see the full project here.

Saving a built and trained model

You can collect and save your built model using the collect-model.sh script. Just follow the steps below:

  1. Collect the path of the model inside the Jupyter notebook. You can do so by running:
docker-compose exec jupyter-notebook sh -c "ls -td snapmath-model/* | head -n 1"
  2. Get the base path of the Jupyter container:
docker-compose exec jupyter-notebook pwd
  3. Get the container id of the Jupyter notebook:
docker ps | grep "jupyter-notebook"

If you are using docker-compose you can also use the container name.

  4. Now call the collect-model.sh script, providing these arguments:
./collect-model.sh jupyter-notebook CONTAINER_BASE_PATH/MODEL_PATH

A concrete example is below:

./collect-model.sh 2c40e8ce7197 /home/jovyan/snapmath-model/1703825980

where:

  • 2c40e8ce7197 is the Jupyter notebook container id
  • /home/jovyan/snapmath-model/1703825980 is the new model to be saved
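At its core, collecting a model out of a running container comes down to a `docker cp`. A minimal sketch of assembling that command is below (the function and the `./models` destination are assumptions for illustration; collect-model.sh itself may do more):

```python
def build_collect_command(container, model_path, dest="./models"):
    # docker cp copies a path out of a container: docker cp CONTAINER:SRC DEST
    return ["docker", "cp", f"{container}:{model_path}", dest]

cmd = build_collect_command("2c40e8ce7197", "/home/jovyan/snapmath-model/1703825980")
# The list can then be executed with subprocess.run(cmd, check=True).
```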

Exposing the model using TensorFlow Serving

To see a trained model performing inferences, we can leverage TensorFlow Serving, a ready-made Docker image that loads the model and makes it available via REST or gRPC. The Makefile already has a command to expose a collected model. First, we need to build the image based on the provided Dockerfile, adding the collected models inside of it:

make build-tf-serving

Then we can run:

make run-tf-service

The REST API will be available at localhost:8501, and you can easily call this API using the provided client using Jupyter Notebook.
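TensorFlow Serving's REST API accepts POST requests to /v1/models/MODEL_NAME:predict with a JSON body holding an "instances" batch. A minimal sketch of building such a request is below (the model name and input values are placeholders, not taken from the repository):

```python
import json

def build_predict_request(model_name, instances):
    # TensorFlow Serving's predict endpoint expects a body of {"instances": [...]}
    url = f"http://localhost:8501/v1/models/{model_name}:predict"
    body = json.dumps({"instances": instances})
    return url, body

url, body = build_predict_request("snapmath", [[0.0, 1.0]])
# POST `body` to `url` with Content-Type: application/json to get predictions back.
```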

Next steps

For now I see an interesting point to inspect:

  • Check whether it is possible to run this model using Go.

Contribution

Contributions are welcome! If you find something interesting to improve just open a PR :)