Accelerate Inference of MobileNet V2 Image Classification Model with NNCF in OpenVINO™

This notebook can be launched in Binder or Google Colab.

This tutorial demonstrates how to apply INT8 quantization to the MobileNet V2 image classification model using the NNCF Post-Training Quantization API. The tutorial uses MobileNetV2 and the CIFAR-10 dataset, and the code is designed to be extensible to custom models and datasets.
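
The core of the workflow is a single call to nncf.quantize, which takes an OpenVINO model and a calibration dataset. The sketch below illustrates the idea; the IR file name, preprocessing, and batch size are assumptions for illustration and are not necessarily the exact code of the notebook.

```python
# Minimal sketch of NNCF post-training quantization for an OpenVINO IR model.
# Assumes the model has already been converted to IR as "mobilenet_v2.xml" (hypothetical
# path) and that CIFAR-10 validation images are available via torchvision.

import nncf
import openvino as ov
import torch
from torchvision import datasets, transforms

# Preprocessing matching a typical ImageNet-pretrained MobileNetV2 setup (assumption).
transform = transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),
])
val_data = datasets.CIFAR10(root="data", train=False, download=True, transform=transform)
val_loader = torch.utils.data.DataLoader(val_data, batch_size=1)

def transform_fn(data_item):
    # NNCF expects each calibration sample in the format the model consumes:
    # here, a single image tensor without the label.
    images, _ = data_item
    return images.numpy()

calibration_dataset = nncf.Dataset(val_loader, transform_fn)

core = ov.Core()
ov_model = core.read_model("mobilenet_v2.xml")
quantized_model = nncf.quantize(ov_model, calibration_dataset)
ov.save_model(quantized_model, "mobilenet_v2_int8.xml")  # serialize the INT8 model
```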

Notebook Contents

The tutorial consists of the following steps:

  • Prepare the model for quantization.
  • Define data-loading functionality.
  • Perform quantization.
  • Compare accuracy of the original and quantized models.
  • Compare performance of the original and quantized models (a benchmarking sketch follows this list).
  • Compare predictions of the two models on a single sample image.
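
One way to compare the throughput of the two models is to compile both IRs and time repeated inference, as in the rough sketch below; the file names and input shape follow the assumptions from the earlier sketch, and the notebook itself may instead rely on OpenVINO's benchmark_app tool for this step.

```python
# Rough sketch of comparing inference performance of the original (FP32) and
# quantized (INT8) IR models on CPU.

import time
import numpy as np
import openvino as ov

core = ov.Core()
compiled_fp32 = core.compile_model("mobilenet_v2.xml", "CPU")
compiled_int8 = core.compile_model("mobilenet_v2_int8.xml", "CPU")

# Dummy input with the assumed input shape (batch 1, 3x224x224).
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

def measure_fps(compiled_model, num_iters=200):
    # Simple synchronous measurement; benchmark_app gives more rigorous numbers.
    start = time.perf_counter()
    for _ in range(num_iters):
        compiled_model(dummy)
    elapsed = time.perf_counter() - start
    return num_iters / elapsed

print(f"FP32: {measure_fps(compiled_fp32):.1f} FPS")
print(f"INT8: {measure_fps(compiled_int8):.1f} FPS")
```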

Installation Instructions

If you have not installed all required dependencies, follow the Installation Guide.