Image classification example on Coral with TensorFlow Lite
The Python script takes arguments for the model, the labels file, and the image you want to process, and then prints the model's prediction for the image to the terminal.
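The script's command-line interface (the --model, --labels, and --input flags used below) can be sketched roughly as follows; this parser is a simplified stand-in for illustration, not the repo's actual code:

```python
import argparse

def make_parser():
    # Simplified stand-in for the example script's CLI: it accepts
    # a .tflite model, a labels file, and an input image.
    parser = argparse.ArgumentParser(
        description="Classify an image with a TensorFlow Lite model.")
    parser.add_argument("--model", required=True,
                        help="Path to the .tflite model file")
    parser.add_argument("--labels", required=True,
                        help="Path to the labels text file")
    parser.add_argument("--input", required=True,
                        help="Path to the image to classify")
    return parser

# Parse the same arguments shown in the run command below.
args = make_parser().parse_args([
    "--model", "models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite",
    "--labels", "models/inat_bird_labels.txt",
    "--input", "images/parrot.jpg",
])
print(args.input)
```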
Set up your device
First, be sure you have completed the setup instructions for your Coral device.
Importantly, you should have the latest TensorFlow Lite runtime installed (as described in the Python quickstart).
Clone this Git repo onto your computer:
mkdir google-coral && cd google-coral
git clone https://github.com/google-coral/tflite --depth 1
Install this example's dependencies:
cd tflite/python/examples/classification
./install_requirements.sh
Run the code
Use this command to run image classification with the model and photo downloaded by the above script (photo shown in figure 1):
python3 classify_image.py \
  --model models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
  --labels models/inat_bird_labels.txt \
  --input images/parrot.jpg
You should see results like this:
Initializing TF Lite interpreter...
INFO: Initialized TensorFlow Lite runtime.
----INFERENCE TIME----
Note: The first inference on Edge TPU is slow because it includes loading the model into Edge TPU memory.
11.8ms
3.0ms
2.8ms
2.9ms
2.9ms
-------RESULTS--------
Ara macao (Scarlet Macaw): 0.76562
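Each result line pairs a name from the labels file with a confidence score. A labels file is just a text file mapping class indices to names; a minimal sketch of loading one, assuming a one-label-per-line format (the file contents here are made up for demonstration):

```python
def load_labels(path):
    # Read one label per line; the line number is the class index.
    with open(path) as f:
        return {i: line.strip() for i, line in enumerate(f)}

# Hypothetical two-line labels file, written just for this demo.
with open("demo_labels.txt", "w") as f:
    f.write("Ara macao (Scarlet Macaw)\n"
            "Ara ararauna (Blue-and-yellow Macaw)\n")

labels = load_labels("demo_labels.txt")
print(labels[0])  # -> Ara macao (Scarlet Macaw)
```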
To demonstrate varying inference speeds, the example repeats the same inference five times. Your inference speeds might be different based on your host platform and whether you're using the USB Accelerator with a USB 2.0 or 3.0 connection.
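The warm-up effect can be illustrated with a generic timing loop; `run_inference` below is a placeholder for the real interpreter call, with an artificial delay standing in for the one-time model load on the first run:

```python
import time

_loaded = False

def run_inference():
    # Placeholder for the real interpreter call: the first invocation
    # simulates one-time model loading, similar to the Edge TPU caching
    # the model in its memory.
    global _loaded
    if not _loaded:
        time.sleep(0.01)   # stand-in for loading the model
        _loaded = True
    time.sleep(0.003)      # stand-in for the inference itself

times = []
for _ in range(5):
    start = time.perf_counter()
    run_inference()
    times.append((time.perf_counter() - start) * 1000)

for t in times:
    print(f"{t:.1f}ms")    # the first value is noticeably larger
```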
To compare the performance when not using the Edge TPU, try running it again with the model that's not compiled for the Edge TPU:
python3 classify_image.py \
  --model models/mobilenet_v2_1.0_224_inat_bird_quant.tflite \
  --labels models/inat_bird_labels.txt \
  --input images/parrot.jpg