Intel's Deep Learning Inference Engine backend
Dmitry Kurtaev edited this page Feb 8, 2018 · 3 revisions
Intel's Deep Learning Inference Engine is part of the Intel® Computer Vision SDK. You can use it as a computational backend for the OpenCV deep learning (dnn) module.
- Download and install Intel's Deep Learning Deployment Toolkit.
- Build OpenCV, specifying the paths to the installed libraries and plugins.
  - Ubuntu:
    cmake \
        -DWITH_INF_ENGINE=ON \
        -DINTEL_CVSDK_DIR=/opt/intel/deeplearning_deploymenttoolkit/deployment_tools/ \
        -DIE_PLUGINS_PATH=/opt/intel/deeplearning_deploymenttoolkit/deployment_tools/inference_engine/lib/ubuntu_16.04/intel64/ \
        -DENABLE_CXX11=ON \
        ...
  - Microsoft Windows:
    cmake ^
        -DWITH_INF_ENGINE=ON ^
        -DINTEL_CVSDK_DIR=C:\\Intel\\DeepLearning-DeploymentToolkit_1.0.5852 ^
        -DIE_PLUGINS_PATH=C:\\Intel\\DeepLearning-DeploymentToolkit_1.0.5852\\lib\\intel64\\Release ^
        -DENABLE_CXX11=ON ^
        ...
    Add the path to Intel's Inference Engine plugins to the PATH variable:
    set PATH=C:\Intel\DeepLearning-DeploymentToolkit_1.0.5852\bin\intel64\Release;%PATH%
- Enable Intel's Inference Engine backend right after the cv::dnn::readNetFrom* invocation:
  net.setPreferableBackend(DNN_BACKEND_INFERENCE_ENGINE);
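The steps above can be sketched end to end in C++. This is a minimal example, not a definitive recipe: the GoogLeNet model file names, the input image, and the preprocessing parameters (input size, mean values) are assumptions — substitute the files and parameters for your own model.

```cpp
// Minimal sketch: load a Caffe model, switch the network to Intel's
// Inference Engine backend, and classify one image.
// NOTE: the model/image file names below are placeholders (assumptions).
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>
#include <iostream>

int main() {
    using namespace cv;
    using namespace cv::dnn;

    // Load a network with one of the cv::dnn::readNetFrom* functions.
    Net net = readNetFromCaffe("bvlc_googlenet.prototxt",
                               "bvlc_googlenet.caffemodel");

    // Enable Intel's Inference Engine backend right after loading.
    net.setPreferableBackend(DNN_BACKEND_INFERENCE_ENGINE);

    // Preprocess the input image into a 4D blob (parameters are
    // model-specific; these match a typical GoogLeNet setup).
    Mat img = imread("example.jpg");
    Mat blob = blobFromImage(img, 1.0, Size(224, 224),
                             Scalar(104, 117, 123));
    net.setInput(blob);

    // Run the forward pass and report the top class.
    Mat prob = net.forward();
    Point classId;
    double confidence;
    minMaxLoc(prob.reshape(1, 1), nullptr, &confidence, nullptr, &classId);
    std::cout << "class: " << classId.x
              << " confidence: " << confidence << std::endl;
    return 0;
}
```

If the backend cannot be used (e.g. the plugins are not on the library path), the forward pass throws an exception, so make sure the plugin directory set up above is reachable at run time.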