
ResNet50 v1.5 inference

Description

This document has instructions for running ResNet50 v1.5 inference using Intel-optimized TensorFlow.

Datasets

Download and preprocess the ImageNet dataset using the instructions here. After running the conversion script you should have a directory with the ImageNet dataset in the TF records format.

Set the DATASET_DIR to point to the TF records directory when running ResNet50 v1.5.
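Before launching a run, it can help to confirm that DATASET_DIR actually contains the converted TF records. A hypothetical sanity check (not part of the repo); the "validation-*" shard naming matches the output of the standard ImageNet conversion script, but verify it against your own files:

```shell
# Count the validation TF record shards under a directory; a result of 0
# usually means DATASET_DIR points at the wrong place or the conversion
# script has not been run. (Helper name and naming pattern are assumptions.)
count_validation_shards() {
    ls "$1"/validation-* 2>/dev/null | wc -l
}
# Example: count_validation_shards "$DATASET_DIR"   # expect a nonzero count
```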

Quick Start Scripts

  • inference.sh: Runs realtime inference using a default batch_size=1 for the specified precision (int8, fp32, bfloat16 or fp16). To run inference for throughput, set the BATCH_SIZE environment variable.
  • inference_realtime_multi_instance.sh: Runs multi-instance realtime inference using 4 cores per instance for the specified precision (fp32, int8, bfloat16, fp16, bfloat32) with 1500 steps and 50 warmup steps. If no DATASET_DIR is set, synthetic data is used. Waits for all instances to complete, then prints a summarized throughput value.
  • inference_realtime_weightsharing.sh: Runs multi-instance realtime inference with weight sharing for the specified precision (int8 or bfloat16) with 1500 steps and 100 warmup steps. If no DATASET_DIR is set, synthetic data is used. Waits for all instances to complete, then prints a summarized throughput value.
  • inference_throughput_multi_instance.sh: Runs multi-instance batch inference using 1 instance per socket for the specified precision (fp32, int8, bfloat16, fp16, bfloat32) with 1500 steps and 50 warmup steps. If no DATASET_DIR is set, synthetic data is used. Waits for all instances to complete, then prints a summarized throughput value.
  • accuracy.sh: Measures inference accuracy for the specified precision (fp32, int8, bfloat16, fp16, bfloat32). Setting the DATASET_DIR environment variable is required.
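As the table notes, inference.sh runs with batch_size=1 unless BATCH_SIZE is set. A minimal sketch of the default-override pattern such scripts typically use (the exact variable handling inside the script is an assumption):

```shell
# batch_size falls back to 1 (realtime/latency mode) when BATCH_SIZE is
# unset; exporting BATCH_SIZE before the run switches the same script to a
# throughput-style measurement. (Illustrative only.)
default_batch() {
    echo "${BATCH_SIZE:-1}"
}
unset BATCH_SIZE
default_batch          # prints 1 when BATCH_SIZE is unset
BATCH_SIZE=128
default_batch          # prints 128
```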

Run the model

Set up your environment using the instructions below, depending on whether you are using AI Tools:


To run using AI Tools on Linux you will need:

  • numactl
  • wget
  • openmpi-bin (only required for multi-instance)
  • openmpi-common (only required for multi-instance)
  • openssh-client (only required for multi-instance)
  • openssh-server (only required for multi-instance)
  • libopenmpi-dev (only required for multi-instance)
  • horovod==0.27.0 (only required for multi-instance)
  • intel-extension-for-tensorflow (only required when using oneDNN Graph optimization)
  • Activate the tensorflow conda environment
    conda activate tensorflow

To run without AI Tools on Linux you will need:

  • Python 3
  • intel-tensorflow>=2.5.0
  • git
  • numactl
  • wget
  • openmpi-bin (only required for multi-instance)
  • openmpi-common (only required for multi-instance)
  • openssh-client (only required for multi-instance)
  • openssh-server (only required for multi-instance)
  • libopenmpi-dev (only required for multi-instance)
  • horovod==0.27.0 (only required for multi-instance)
  • intel-extension-for-tensorflow (only required when using oneDNN Graph optimization)
  • A clone of the AI Reference Models repo
    git clone https://github.com/IntelAI/models.git

To run without AI Tools on Windows you will need:

After finishing the setup above, download the pretrained model for your PRECISION and set the PRETRAINED_MODEL environment variable to the path of the frozen graph. On Windows, use a browser to download the pretrained model from the links below. On Linux, run:

# FP32, FP16 and BFloat32 Pretrained model:
wget https://zenodo.org/record/2535873/files/resnet50_v1.pb
export PRETRAINED_MODEL=$(pwd)/resnet50_v1.pb

# Int8 Pretrained model:
wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v1_8/resnet50v1_5_int8_pretrained_model.pb
export PRETRAINED_MODEL=$(pwd)/resnet50v1_5_int8_pretrained_model.pb

# Int8 pretrained model for oneDNN Graph (only used when the Intel Extension for TensorFlow plugin is installed, since oneDNN Graph optimization is enabled by default in that case):
wget https://storage.googleapis.com/intel-optimized-tensorflow/models/2_12_0/rn50_itex_int8.pb
export PRETRAINED_MODEL=$(pwd)/rn50_itex_int8.pb

# BFloat16 Pretrained model:
wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v1_8/resnet50_v1_5_bfloat16.pb
export PRETRAINED_MODEL=$(pwd)/resnet50_v1_5_bfloat16.pb
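The download links above amount to a precision-to-URL mapping. A small helper (hypothetical, not part of the repo) that echoes the right link for a given precision; the "int8-itex" key for the oneDNN Graph variant is an invented label for this sketch:

```shell
# Map a precision name to the pretrained-model URL listed above.
pretrained_url() {
    case "$1" in
        fp32|fp16|bfloat32)
            echo "https://zenodo.org/record/2535873/files/resnet50_v1.pb" ;;
        int8)
            echo "https://storage.googleapis.com/intel-optimized-tensorflow/models/v1_8/resnet50v1_5_int8_pretrained_model.pb" ;;
        int8-itex)  # oneDNN Graph model for Intel Extension for TensorFlow
            echo "https://storage.googleapis.com/intel-optimized-tensorflow/models/2_12_0/rn50_itex_int8.pb" ;;
        bfloat16)
            echo "https://storage.googleapis.com/intel-optimized-tensorflow/models/v1_8/resnet50_v1_5_bfloat16.pb" ;;
        *)
            echo "unsupported precision: $1" >&2
            return 1 ;;
    esac
}
```

Usage would then be, for example: wget "$(pretrained_url fp32)" followed by export PRETRAINED_MODEL=$(pwd)/resnet50_v1.pb.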

Set the environment variables and run a quickstart script on either Linux or Windows. See the list of quickstart scripts above for details on the different options.

Run on Linux

# cd to your AI Reference Models directory
cd models

export PRETRAINED_MODEL=<path to the frozen graph downloaded above>
export DATASET_DIR=<path to the ImageNet TF records>
export PRECISION=<set the precision to "int8" or "fp32" or "bfloat16" or "fp16" or "bfloat32">
export OUTPUT_DIR=<path to the directory where log files and checkpoints will be written>
# For a custom batch size, set env var `BATCH_SIZE` or it will run with a default value.
export BATCH_SIZE=<customized batch size value>

./quickstart/image_recognition/tensorflow/resnet50v1_5/inference/cpu/<script name>.sh
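A typo in PRECISION is an easy failure mode. A hypothetical pre-flight check against the values the Linux quickstart scripts accept (the function is illustrative, not part of the repo):

```shell
# Accept only the precisions listed for the Linux quickstart scripts.
check_precision() {
    case "$1" in
        int8|fp32|bfloat16|fp16|bfloat32) echo "ok" ;;
        *) echo "invalid PRECISION: $1" >&2; return 1 ;;
    esac
}
# Example: check_precision "$PRECISION" || exit 1
```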

Run on Windows

Using cmd.exe, run:

# cd to your AI Reference Models directory
cd models

set PRETRAINED_MODEL=<path to the frozen graph downloaded above>
set DATASET_DIR=<path to the ImageNet TF records>
set PRECISION=<set the precision to "int8" or "fp32">
set OUTPUT_DIR=<directory where log files will be written>
# For a custom batch size, set env var `BATCH_SIZE` or it will run with a default value.
set BATCH_SIZE=<customized batch size value>

# Run a quickstart script (inference.sh and accuracy.sh are supported on Windows)
bash quickstart\image_recognition\tensorflow\resnet50v1_5\inference\cpu\<script name>.sh

Note: You may use cygpath to convert the Windows paths to Unix paths before setting the environment variables. As an example, if the dataset location on Windows is D:\user\ImageNet, convert the Windows path to Unix as shown:

cygpath D:\user\ImageNet
/d/user/ImageNet

Then set the DATASET_DIR environment variable: set DATASET_DIR=/d/user/ImageNet.
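If cygpath is unavailable, the same conversion can be approximated in plain shell. A rough sketch of what cygpath does for simple drive-letter paths (UNC paths and other edge cases are not handled; the helper name is invented):

```shell
# Convert a simple Windows path (D:\user\ImageNet) to its Unix form
# (/d/user/ImageNet): lowercase the drive letter, flip the backslashes.
win_to_unix() {
    drive=$(printf '%s' "${1%%:*}" | tr '[:upper:]' '[:lower:]')
    rest=$(printf '%s' "${1#*:}" | tr '\\' '/')
    printf '/%s%s\n' "$drive" "$rest"
}
win_to_unix 'D:\user\ImageNet'   # prints /d/user/ImageNet
```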

Additional Resources