This repository contains NLP use cases built using various Intel AI components, with a focus on the OpenVINO™ toolkit. Each use case is accompanied by detailed documentation in its respective folder.
| Use Case Name | Description | Folder Name |
|---|---|---|
| Quantization Aware Training and Inference using OpenVINO™ toolkit | An end-to-end NLP workflow with Quantization Aware Training using Optimum-Intel*, and inference using Optimum-Intel*, OpenVINO™ Model Server, and Optimum ONNX Runtime with the OpenVINO™ Execution Provider | question-answering-bert-qat |