dfxdemo is a simple Python-based demo that demonstrates how to use the
DeepAffex™ Extraction Library and DeepAffex™ Cloud API.
The demo can extract facial blood-flow from a video file or from a webcam, send it to the DeepAffex™ Cloud for processing, and display the results. (This process is called 'making a measurement'.) It can also be used to display historical results or to view study details.
If you haven't already done so, the first step is to register for a DeepAffex™ developer account and request a cloud API license key. You can do this by visiting the DFX Dashboard and selecting the Sign Up link at the bottom.
Once you are logged in to the Dashboard, you will see a Licenses section on the left. Clicking on it will reveal your organization license that you can use in the subsequent steps. Please click on the eye-icon (👁️) to reveal your license and copy and save it locally. (You will need it for registering later.)
Please ensure you have the following software installed:
Note: Please see the section Using Docker for an alternative.

Note: Please see the section Raspberry Pi if installing on a Raspberry Pi.
Clone the dfxdemo application from GitHub.

```shell
git clone https://github.com/nuralogix/dfx-demo-py
```

Create a Python virtual environment inside the cloned repo, activate it, and upgrade pip. (On Ubuntu, you may need to run `sudo apt-get install python3-venv` to enable venv support.)

```shell
cd dfx-demo-py
python3 -m venv venv        # on Windows: python -m venv venv
source venv/bin/activate    # on Windows: venv\Scripts\activate
python -m pip install --upgrade pip setuptools wheel
```

Download the Python wheel for the DeepAffex™ Extraction Library for your platform to the `wheels/` folder.
Install dfxdemo in editable mode (this also installs the other dependencies).

```shell
pip install -e ".[mediapipe]" -f wheels/
```

Note: Please see the section Using Dlib for an alternative.
dfxdemo has top-level commands that roughly correspond to the way the DFX API is organized. All commands and subcommands accept a `--help` argument.
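This nested command/subcommand layout is the standard argparse pattern; here is a minimal illustrative sketch of that pattern (hypothetical — not dfxdemo's actual parser):

```python
import argparse

# Build a tiny two-level CLI: `prog org register [license_key]`,
# mirroring the command/subcommand shape described above.
parser = argparse.ArgumentParser(prog="dfxdemo")
subparsers = parser.add_subparsers(dest="command", required=True)

org = subparsers.add_parser("org", help="Organization operations")
org_sub = org.add_subparsers(dest="subcommand", required=True)

register = org_sub.add_parser("register", help="Register a license")
register.add_argument("license_key", nargs="?", default=None)

args = parser.parse_args(["org", "register", "abc123"])
print(args.command, args.subcommand, args.license_key)
```

Because each subparser is its own `ArgumentParser`, every level automatically gets its own `--help`.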
Register your organization license on the DeepAffex™ Cloud to obtain a device token. This is generally the first thing you have to do (unless you don't intend to make measurements).

```shell
dfxdemo org register <your_license_key>
```

You can leave `<your_license_key>` blank to read it from the environment variable `DFXDEMO_LICENSE` or enter it securely in the terminal.
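That fallback order (argument, then environment variable, then secure prompt) can be sketched as follows (hypothetical helper, not dfxdemo's actual code):

```python
import getpass
import os

def read_license_key(cli_value=None):
    """Resolve the license key: CLI argument first, then the
    DFXDEMO_LICENSE environment variable, then a secure prompt."""
    if cli_value:
        return cli_value
    env_value = os.environ.get("DFXDEMO_LICENSE")
    if env_value:
        return env_value
    # getpass avoids echoing the secret to the terminal
    return getpass.getpass("License key: ")

os.environ["DFXDEMO_LICENSE"] = "demo-key"  # simulate the env var being set
print(read_license_key())
```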
Note: By default, the demo stores tokens in a file called `config.json`.
In a production application, you will need to manage all tokens securely.
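As a sketch of what such plain-JSON token handling might look like (the file layout and key names here are assumptions for illustration, not dfxdemo's actual schema):

```python
import json
from pathlib import Path

CONFIG = Path("config.json")

def save_tokens(device_token, user_token=""):
    # Plain-text storage, as the demo does for simplicity; a production
    # app should use an OS keychain or an encrypted store instead.
    CONFIG.write_text(json.dumps(
        {"device_token": device_token, "user_token": user_token}, indent=2))

def load_tokens():
    if CONFIG.is_file():
        return json.loads(CONFIG.read_text())
    return {"device_token": "", "user_token": ""}

save_tokens("dev-token-123")
print(load_tokens()["device_token"])
```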
If registration fails, please check the number of registered devices in the DeepAffex™ dashboard and ensure that you have not reached the active device limit.
Login as a user to get a user token. For the purposes of this demo, you need this step since a user token is necessary to list studies (see the next section). In a production app, you will likely use Anonymous Measurements with a fixed Study ID.

```shell
dfxdemo user login <email> <password>
```

You can leave `<password>` blank to read it from the environment variable `DFXDEMO_PASSWORD` or enter it securely in the terminal.
Note: All the commands below use the tokens obtained above.
List the available DFX Studies and retrieve the details of the one you want to use.
Note: The DeepAffex™ Cloud is organized around the concept of Studies - a DFX Study is a collection of biosignals of interest that are computed in one measurement.
```shell
dfxdemo studies list
dfxdemo study get <study_id>
```

Select a study for use in measurements.

```shell
dfxdemo study select <study_id>
```

Make a measurement from a video using the selected study

```shell
dfxdemo measure make /path/to/video_file
```

or

Make a measurement from a webcam using the selected study

```shell
dfxdemo measure make_camera
```

Retrieve detailed results of the last measurement you made

```shell
dfxdemo measure get
```

List your historical measurements

```shell
dfxdemo measurements list
```

If you intend to use the DeepAffex™ Cloud API in mainland China, please use the `--rest-url` option of the `dfxdemo org register` command as shown below:
```shell
dfxdemo org register --rest-url https://api.deepaffex.cn/ <your_license_key>
```

You can experiment with dfxdemo using Docker by following the instructions below. There are a few limitations:

- `measure make_camera` will not work since the container doesn't have access to a camera
- `--headless` needs to be passed to `measure make` since the container doesn't have access to an X-server.
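The `--headless` requirement comes down to whether an X server is reachable; one simple heuristic for that is checking the `DISPLAY` environment variable (illustrative only — dfxdemo itself requires you to pass the flag explicitly):

```python
import os

def should_run_headless():
    """Return True when no X server appears to be available, e.g. inside
    a container where the DISPLAY environment variable is unset."""
    return not os.environ.get("DISPLAY")

print("Headless recommended:", should_run_headless())
```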
```shell
docker build . -t dfxdemo --build-arg EXTRAS_REQUIRE=mediapipe
docker image prune -f  # Optional
```

In the commands below, please replace `${PWD}` with `%CD%` on Windows.

```shell
# To run most commands, use this, assuming ${PWD} contains config.json etc.
docker run -it --rm -v ${PWD}:/app dfxdemo org register <your_license_key>

# To run `measure make`, use this, updating /path/to/videos to a path on your machine...
docker run -it --rm -v ${PWD}:/app -v /path/to/videos:/videos dfxdemo measure make /videos/video_file
```

If you don't need to make measurements, you can build without a face-tracker.

```shell
docker build . -t dfxdemo --build-arg EXTRAS_REQUIRE=dummy
```

You can use Dlib instead of MediaPipe as the face-tracker. Since Dlib isn't distributed as pre-compiled Python wheels, you will need to install a C++ compiler and toolchain capable of compiling Python extensions. Depending upon your platform, please install:
- Windows: Visual Studio 2022 or newer
- macOS (untested): Xcode
- Linux: gcc and Python development libraries
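Before attempting the Dlib build, it can be handy to verify that a compiler is actually on your PATH; a small illustrative check (not part of dfxdemo — pip and CMake perform their own detection):

```python
import shutil

def have_cpp_toolchain():
    """Return True if a usable C++ compiler appears to be on PATH."""
    candidates = ("g++", "clang++", "cl")  # Linux/gcc, macOS/Xcode, MSVC
    return any(shutil.which(c) is not None for c in candidates)

print("C++ toolchain found:", have_cpp_toolchain())
```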
Note: On Ubuntu 22.04, the following commands should work

```shell
sudo apt-get install build-essential                # Compiler and build tools
sudo apt-get install python3-dev                    # Python development headers
sudo apt-get install libopenblas-dev liblapack-dev  # for Dlib
```

To install dfxdemo with Dlib, run the following in your activated Python virtual environment:

```shell
pip install --upgrade cmake
pip install -e ".[dlib]"
```

Please download and unzip the Dlib face landmarks dat file into the `res` folder.
dfxdemo has basic support for Raspberry Pi. We have verified that it works on
these Raspberry Pi models:
- Raspberry Pi 5 Model B Rev 1.1 (16GB RAM) on Raspberry Pi OS 64-bit (bookworm)
- Raspberry Pi Compute Module 5 Rev 1.0 (8GB RAM) on Raspberry Pi OS 64-bit (bookworm)
with these camera modules:
- Raspberry Pi Camera Module 3
- Raspberry Pi Camera Module 3 NoIR
- Raspberry Pi Camera Module 2
Update your Raspberry Pi and install Picamera2 using apt.
```shell
sudo apt update
sudo apt full-upgrade
sudo apt install python3-picamera2
```

After cloning the repo, download the Linux ARM64 Python wheel for libdfx to the `wheels/` folder.
When creating the virtual environment, pass the --system-site-packages flag so
that the Picamera2 Python package is picked up.
```shell
python -m venv venv --system-site-packages
```

In the activated venv, install dfxdemo in editable mode (this also installs the other dependencies).

```shell
pip install -e ".[mediapipe]" -f wheels/
```

After you have registered your license and logged in, you can list the cameras attached to your Pi and their ids.

```shell
dfxdemo picameras list
```

You can then choose and configure one of the listed cameras to make a measurement.
```shell
dfxdemo measure make_camera --picamera --params 640x480@30 --camera <id>
```

For better face-tracker performance, you can install an older version of MediaPipe and select it as the face-tracker using `-ft mediapipe`.

```shell
pip install mediapipe==0.10.9
```
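The `--params 640x480@30` value encodes width, height, and frame rate; a minimal sketch of parsing that format (hypothetical helper — dfxdemo parses `--params` internally):

```python
import re

def parse_camera_params(params):
    """Parse a 'WIDTHxHEIGHT@FPS' string such as '640x480@30'
    into a (width, height, fps) tuple of ints."""
    m = re.fullmatch(r"(\d+)x(\d+)@(\d+)", params)
    if not m:
        raise ValueError(f"Bad camera params: {params!r}")
    return tuple(int(g) for g in m.groups())

print(parse_camera_params("640x480@30"))  # -> (640, 480, 30)
```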