DEEP Open Catalogue: Pose Estimation
Author: Lara Lloret Iglesias (CSIC)
Project: This work is part of the DEEP Hybrid-DataCloud project that has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 777435.
This is a plug-and-play tool for real-time pose estimation using deep neural networks. The original model, weights, code, etc. were created by Google and can be found at https://github.com/tensorflow/tfjs-models/tree/master/posenet. PoseNet can be used to estimate either a single pose or multiple poses, meaning there is one version of the algorithm that detects only one person in an image/video and another version that detects multiple persons in an image/video. The module implemented here works on pictures (either uploaded or referenced by URL) and outputs the different body keypoints with their corresponding coordinates and associated confidence scores. It also generates an image with the keypoints superimposed.
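The keypoint output described above lends itself to simple post-processing. Below is a minimal sketch, assuming the prediction comes back as a dictionary with an overall pose score and a list of per-keypoint entries; the field names are illustrative, not the module's exact schema:

```python
# Hypothetical shape of a PoseNet-style prediction: an overall pose score
# plus one entry per body keypoint with its confidence and pixel coordinates.
# (Field names are illustrative, not this module's exact output schema.)
prediction = {
    "score": 0.93,
    "keypoints": [
        {"part": "nose", "score": 0.98, "position": {"x": 301.2, "y": 122.7}},
        {"part": "leftEye", "score": 0.97, "position": {"x": 312.5, "y": 114.3}},
        {"part": "leftWrist", "score": 0.21, "position": {"x": 250.0, "y": 340.8}},
    ],
}

def confident_keypoints(pred, min_score=0.5):
    """Keep only keypoints whose confidence is at least min_score."""
    return [kp for kp in pred["keypoints"] if kp["score"] >= min_score]

print([kp["part"] for kp in confident_keypoints(prediction)])
# → ['nose', 'leftEye']
```

Filtering by keypoint score like this is a common way to drop occluded or poorly detected body parts before drawing the superimposed image.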
To start using this framework run:
```bash
git clone https://github.com/deephdc/posenet-tf
cd posenet-tf
pip install -e .
```
- This project has been tested in Ubuntu 18.04 with Python 3.6.5. Further package requirements are described in the `requirements.txt` file.
- It is a requirement to have Tensorflow>=1.12.0 installed (either in gpu or cpu mode). This is not listed in `requirements.txt`, as it breaks GPU support.
- Run `python -c 'import cv2'` to check that you installed the `opencv-python` package correctly (sometimes dependencies are missed in `pip` installations).
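Since Tensorflow is installed outside `requirements.txt`, it is easy to end up with a missing or too-old version. A small sanity check you could run after installation (a sketch; the helper assumes plain dotted version strings like `1.12.0`):

```python
def version_tuple(v):
    """'1.12.0' -> (1, 12, 0); assumes a plain dotted version string."""
    return tuple(int(part) for part in v.split(".")[:3])

def check_tensorflow(min_version="1.12.0"):
    """Raise if Tensorflow is missing or older than min_version."""
    import tensorflow as tf  # imported lazily so this snippet loads without it
    if version_tuple(tf.__version__) < version_tuple(min_version):
        raise RuntimeError(
            f"Tensorflow>={min_version} required, found {tf.__version__}"
        )
```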
Project based on the cookiecutter data science project template. #cookiecutterdatascience
You can test the posenet module on a number of tasks: predicting a single local image file (or URL) or predicting multiple images (or URLs).
Running the API
To access this package's complete functionality (both for training and predicting) through an API, you have to install the DEEPaaS package:
```bash
git clone https://github.com/indigo-dc/deepaas
cd deepaas
pip install -e .
```
and run it with:

```bash
deepaas-run --listen-ip 0.0.0.0
```

From there you will be able to run training and predictions of this package through the API.
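As a rough sketch of how a client might address the running API: DEEPaaS exposes model endpoints over HTTP and serves a Swagger UI documenting the exact paths. The port, model name, and endpoint path below are assumptions for illustration, not values taken from this package; check the Swagger UI of your deployment for the real ones.

```python
from urllib.parse import urljoin

BASE = "http://0.0.0.0:5000"  # deepaas-run listen address; port is an assumption
MODEL = "posenet"             # hypothetical model name; check the Swagger UI

def predict_url(model=MODEL):
    # DEEPaaS V2-style predict endpoint path (assumed; verify against Swagger)
    return urljoin(BASE, f"/v2/models/{model}/predict/")

# An image could then be uploaded for pose estimation, e.g. with curl:
#   curl -X POST "<predict_url()>" -F "data=@image.jpg"
print(predict_url())  # → http://0.0.0.0:5000/v2/models/posenet/predict/
```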