⚠️ This project has been transferred to the OpenVINO organization at https://github.com/openvinotoolkit/openvino_contrib. Please consider using pip install openvino-optimum to get the latest and greatest package. Thanks for using and starring the project!

Optimum OpenVINO


Optimum OpenVINO is an extension for the Optimum library which brings the Intel OpenVINO backend to Hugging Face Transformers 🤗.

This project provides multiple APIs to enable different tools: OpenVINO Runtime and NNCF (see the sections below).

Install

Install only the runtime:

pip install optimum-openvino

or with all dependencies:

pip install optimum-openvino[all]

OpenVINO Runtime

This module provides an inference API for Hugging Face models. You can use models with PyTorch* or TensorFlow* pretrained weights, or the native OpenVINO IR format (a pair of files, ov_model.xml and ov_model.bin).

To use the OpenVINO backend, import one of the AutoModel classes with the OV prefix and specify a model name or local path in the from_pretrained method.

from optimum.intel.openvino import OVAutoModel

# PyTorch trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_pt=True)

# TensorFlow trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_tf=True)

# Initialize a model from OpenVINO IR
model = OVAutoModel.from_pretrained(<name_or_path>)
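
As a concrete usage sketch, the snippet below runs a full tokenize-and-infer pass on the OpenVINO backend. The bert-base-uncased checkpoint and the numpy return_tensors option are illustrative assumptions, not requirements of the API:

from transformers import AutoTokenizer
from optimum.intel.openvino import OVAutoModel

# Load a PyTorch checkpoint from the Hub and run it on the OpenVINO backend
# (any compatible <name_or_path> works the same way).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = OVAutoModel.from_pretrained("bert-base-uncased", from_pt=True)

# Tokenize the input and run inference; the call returns the usual model
# outputs (for a base model, the encoder hidden states).
inputs = tokenizer("Hello, OpenVINO!", return_tensors="np")
outputs = model(**inputs)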

NNCF

NNCF (Neural Network Compression Framework) is used during model training to apply features such as quantization and pruning. To enable NNCF in your training pipeline, follow these steps:

  1. Import NNCFAutoConfig:
from optimum.intel.nncf import NNCFAutoConfig

NOTE: NNCFAutoConfig must be imported before transformers for the integration to take effect (see the preamble sketch after these steps).

  2. Initialize a config from a .json file:
nncf_config = NNCFAutoConfig.from_json(training_args.nncf_config)
  3. Pass the config to the Trainer object. For example:
trainer = QuestionAnsweringTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    eval_examples=eval_examples if training_args.do_eval else None,
    tokenizer=tokenizer,
    data_collator=data_collator,
    post_process_function=post_processing_function,
    compute_metrics=compute_metrics,
    nncf_config=nncf_config,
)
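
Putting these steps together, a training-script preamble could look roughly like the sketch below (the config file name is borrowed from the config folder mentioned next and is otherwise an assumption; model, dataset, and tokenizer setup are omitted):

# Sketch of a training-script preamble with NNCF enabled (steps 1-3 above).
from optimum.intel.nncf import NNCFAutoConfig  # must come before transformers

from transformers import TrainingArguments  # safe to import after NNCFAutoConfig

# Load the compression configuration from a JSON file
# (the file name here is illustrative).
nncf_config = NNCFAutoConfig.from_json("nncf_bert_config_conll.json")

# ... build the model, datasets, and tokenizer as usual, then pass
# nncf_config to the task-specific Trainer as shown above.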

Training examples can be found in the Transformers library. NNCF configs are published in the config folder. Add --nncf_config with a path to the corresponding config when training your model. More command line examples can be found here.

python examples/pytorch/token-classification/run_ner.py --model_name_or_path bert-base-cased --dataset_name conll2003 --output_dir bert_base_cased_conll_int8 --do_train --do_eval --save_strategy epoch --evaluation_strategy epoch --nncf_config nncf_bert_config_conll.json

To use the NNCF component, install the package with the [nncf] or [all] extras:

pip install optimum-openvino[nncf]

POT (Post-Training Optimization Tool)

TBD