# Inference server

## Dependencies

- NGINX. On Ubuntu you can install it with `sudo apt-get install nginx`.
- poppler-utils, used to convert PDFs to images (see the sketch after this list): `sudo apt-get install poppler-utils`.
- Python library dependencies. Install them with `pip install -r requirements.txt` before running the server.
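
From Python, poppler-utils is typically driven through a wrapper such as `pdf2image`; whether this repository's `predictor.py` uses that exact package is an assumption, but the sketch below shows the kind of PDF-to-image conversion involved (file names are placeholders):

```python
from pdf2image import convert_from_path  # assumes the pdf2image wrapper around poppler-utils

# Render each PDF page to a PIL image (poppler's pdftoppm does the actual work).
# "document.pdf" is a placeholder input file.
pages = convert_from_path("document.pdf", dpi=200)
pages[0].save("document_page_1.png", "PNG")
```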

NOTE: This server has been tested on Ubuntu 16.04 with Python 3.5.

## How to run it

### Download model

The document classification model is hosted on Google Drive. You can download it from: https://drive.google.com/file/d/1wJUnkFiqmwok1gJ2sKHPvJIaNjfT4pk6/view?usp=sharing

Move the downloaded `model.hdf5` file to the `inference_server` folder.

### Environment variables

Set the following environment variables:

| Parameter | Environment variable | Default value |
| --- | --- | --- |
| Number of workers | `MODEL_SERVER_WORKERS` | the number of CPU cores |
| Timeout | `MODEL_SERVER_TIMEOUT` | 120 seconds |
| nginx config path | `NGINX_CONF_PATH` | `/etc/nginx/nginx.conf` |

Example:

```bash
export MODEL_SERVER_WORKERS=1
export MODEL_SERVER_TIMEOUT=120
export NGINX_CONF_PATH=/home/user/UiPath_Document_Classification/inference_server/nginx.conf
```
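
The `serve` script presumably reads these variables at startup; a minimal sketch of that lookup, assuming the defaults listed in the table above:

```python
import multiprocessing
import os

# Fall back to the documented defaults when a variable is not set.
# The exact logic inside the serve script is an assumption.
workers = int(os.environ.get("MODEL_SERVER_WORKERS", multiprocessing.cpu_count()))
timeout = int(os.environ.get("MODEL_SERVER_TIMEOUT", 120))
nginx_conf = os.environ.get("NGINX_CONF_PATH", "/etc/nginx/nginx.conf")
```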

### Run the inference server

By default, the server listens on port 1234. Run it with the following command (`sudo -E` preserves the environment variables you exported above):

```bash
sudo -E ./serve
```

### Time to try it!

Send a file to the server using curl:

```bash
curl -X POST -F "file=@PATH_TO_YOUR_FILE" "http://localhost:1234/document_classification"
```

Result:

```json
{
    "prediction": {
        "confidence": "1.0",
        "class": "invoice"
    },
    "confidences": {
        "invoice": "1.000",
        "passport": "0.000",
        "id_card_2": "0.000",
        "driving_licence": "0.000",
        "id_card_3": "0.000"
    }
}
```
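
The same request can also be sent from Python, for example with the `requests` package (not listed in this README, so installing it is an assumption):

```python
import requests

# Post a document to the classification endpoint, mirroring the curl call above.
# Replace PATH_TO_YOUR_FILE with the path to your document.
url = "http://localhost:1234/document_classification"
with open("PATH_TO_YOUR_FILE", "rb") as f:
    response = requests.post(url, files={"file": f})

print(response.json())  # same JSON structure as shown above
```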