This repository has been archived by the owner on Dec 14, 2022. It is now read-only.

Support loading extension #4

Open
huningxin opened this issue Apr 20, 2020 · 29 comments
@huningxin
Contributor

This feature covers loading CPU extensions via AddExtension and GPU extensions via SetConfig.

The native sample code could be found at https://github.com/opencv/dldt/blob/2020/inference-engine/samples/classification_sample_async/main.cpp#L83 for CPU extension and https://github.com/opencv/dldt/blob/2020/inference-engine/samples/classification_sample_async/main.cpp#L89 for GPU extension.

Please check out more details of extension mechanism at: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Extensibility_DG_Intro.html

For the 2020 release, according to the comment, all ops are supported by MKLDNNPlugin, so there is no built-in CPU extension; the GPU extension is still there.

@dkurt

dkurt commented Apr 26, 2020

@huningxin, there are plans to remove GPU extensions from the distribution as well (there were PriorBoxClustered and something else) - they will be a part of the plugin.

But supporting the extensions mechanism is a critical feature for some customers, thank you!

@huningxin
Contributor Author

@dkurt , thanks for sharing! It is very helpful.

@lionkunonly
Contributor

@huningxin Hi Ningxin, the APIs have been implemented, but I have not tested core.setConfig() with a neural network that has a special (custom) layer in clDNN.

@huningxin
Contributor Author

@dkurt , do you know of any test cases @lionkunonly can use to test the CPU and GPU extensions? Thanks!

@dkurt

dkurt commented Apr 28, 2020

For GPU you may try the .cl files from OpenVINO (but they may be merged into the plugin in a future release). Or just create your own CPU extension library and test with it (check the extgen.py script, which generates a CPU extension template).

@lionkunonly
Contributor

> For GPU you may try the .cl files from OpenVINO (but they may be merged into the plugin in a future release). Or just create your own CPU extension library and test with it (check the extgen.py script, which generates a CPU extension template).

Thanks for your suggestions, I will try.

@huningxin huningxin added this to the M2 milestone Apr 30, 2020
@lionkunonly
Contributor

lionkunonly commented Jun 2, 2020

Hi, @dkurt . I used the method in this link https://docs.openvinotoolkit.org/latest/extension_build.html to create a simple template extension named libtemplate_extension.so, and I used this file and the file grn.cl to test the AddExtension and SetConfig APIs.

AddExtension can load libtemplate_extension.so without reporting an error, and SetConfig can load grn.cl without errors.

But I also tried using AddExtension to load the GPU extension grn.cl, and SetConfig to load libtemplate_extension.so. In my expectation, both of them should report an error, because they are loading extensions they should not accept. But in fact they load these files without errors, too.
I am confused. Do they report errors in native OpenVINO in this situation? If the API runs without errors, does that mean the extension loaded successfully? Could you give me some suggestions?

@dkurt

dkurt commented Jun 2, 2020

@lionkunonly, Hi! As far as I remember, AddExtension just appends the libraries to some internal list. There can be multiple libraries as well, and I'm not sure there is an error path for failed extension loading. They are probably just skipped, and the plugin tries to find the layer implementation in the next added extension. If there are no extensions, or all the libraries failed to load, the plugin should throw an unknown-layer exception.

@ilya-lavrenov, @ilyachur, can you add something about AddExtension behavior?

@ilyachur

ilyachur commented Jun 2, 2020

@lionkunonly Hi,

@dkurt is right. AddExtension appends the library to a list of extensions and also tries to load the extensions into registered plugins that support such extensions.

Did you try to use SetConfig in order to load extension to the CPU plugin? CPU plugin doesn't throw an error because it ignores this parameter and doesn't parse it.

@lionkunonly
Contributor

lionkunonly commented Jun 5, 2020

> Did you try to use SetConfig in order to load extension to the CPU plugin? CPU plugin doesn't throw an error because it ignores this parameter and doesn't parse it.

Hi @ilyachur.
Yes, I tried to use SetConfig to load an extension into the CPU plugin. But in my experiment, I fixed the device parameter as GPU, because I wanted to see whether SetConfig would report any error when it tries to load the CPU extension on the GPU plugin. And I found there was no error.

@ilyachur

ilyachur commented Jun 5, 2020

@lionkunonly Ok, thank you for the clarification.

@vladimir-paramuzov Can you take a look at this issue?

@vladimir-paramuzov

> Yes, I tried to use SetConfig to load an extension into the CPU plugin. But in my experiment, I fixed the device parameter as GPU, because I wanted to see whether SetConfig would report any error when it tries to load the CPU extension on the GPU plugin. And I found there was no error.

@lionkunonly Do I get it correctly that you tried something like:

        std::string e = "libcpu_extensions.so";
        ie.SetConfig({{ CONFIG_KEY(CONFIG_FILE), e }}, "GPU");

and didn't get any errors? I checked this code and it throws an exception like:
    Error loading custom layer configuration file: libcpu_extensions.so, No document element found at offset 18360

from here: https://github.com/openvinotoolkit/openvino/blob/master/inference-engine/src/cldnn_engine/cldnn_custom_layer.cpp#L226

Applying such a config to the CPU plugin leads to a [NOT_FOUND] Unsupported property CONFIG_FILE by CPU plugin error.

So if you use SetConfig in some other manner, could you please share a code snippet, so we can check it?

@lionkunonly
Contributor

@vladimir-paramuzov The code snippet looks like:

InferenceEngine::Core actual_;
std::string extension_absolute_path = "./test/cldnn_global_custom_kernels/libcpu_extension.so";
try {
    actual_.SetConfig({{ie::PluginConfigParams::KEY_CONFIG_FILE, extension_absolute_path}}, "GPU");
  } catch (const std::exception& error) {
    Napi::TypeError::New(env, error.what()).ThrowAsJavaScriptException();
    return;
  } catch (...) {
    Napi::Error::New(env, "Unknown/internal exception happened.")
        .ThrowAsJavaScriptException();
    return;
  }

The implementation is written in C++ using the N-API of Node.js; ie is an alias for the InferenceEngine namespace. I tested the extension files libcpu_extension.so and libtemplate_extension.so again, and neither of them throws an exception.

@vladimir-paramuzov

@lionkunonly OK, I see. As I understand it, this happens because the ie.SetConfig() call doesn't trigger plugin creation; it just saves the config for not-yet-loaded plugins and passes it to the corresponding plugin later (on creation).
You can check this by adding, for example, an ie.GetAvailableDevices(); call before SetConfig. In that case the exception will be thrown, because GetAvailableDevices creates all the plugins, so the config will actually be validated immediately on the SetConfig call.

To my mind this is expected behavior, but cc @ilya-lavrenov to confirm it.

@ilya-lavrenov

Right, the behavior of SetConfig, which does not create a plugin if it's not actually needed, is OK.

@lionkunonly
Contributor

> @lionkunonly OK, I see. As I understand it, this happens because the ie.SetConfig() call doesn't trigger plugin creation; it just saves the config for not-yet-loaded plugins and passes it to the corresponding plugin later (on creation).
> You can check this by adding, for example, an ie.GetAvailableDevices(); call before SetConfig. In that case the exception will be thrown, because GetAvailableDevices creates all the plugins, so the config will actually be validated immediately on the SetConfig call.

Thanks for your help, I got your idea.

@lionkunonly
Contributor

@vladimir-paramuzov Hi. Is there any existing native CPU or GPU extension for testing the SetConfig and AddExtension APIs? If there is, could you please point me to it? Thanks

@ilya-lavrenov

@huningxin after you build OpenVINO tests, there is a test extension library bin/intel64/Release/lib/libextension_tests.so

@lionkunonly
Contributor

@ilya-lavrenov I left a comment about the error when building OpenVINO on Linux: openvinotoolkit/openvino#844. Could you give me some suggestions about it?

@ilya-lavrenov

@ilyachur I wonder why template extensions cannot find libraries..

@ilyachur

ilyachur commented Jun 16, 2020

Hi @lionkunonly @ilya-lavrenov

I faced this issue when I had another version of OpenVINO in my environment. For example, if your environment (LD_LIBRARY_PATH, PATH, or something like that) contains a path to an old Inference Engine release which doesn't contain some libraries, that can be the root cause of this issue.

@lionkunonly Could you please check that you don't have another version of OpenVINO in your environment?
Otherwise find_package can find an incorrect version of OpenVINO.

find_package(InferenceEngine REQUIRED)

@ilya-lavrenov

> doesn't contain some libraries, that can be the root cause of this issue.

But all libraries are taken as-is from this config file. So, if the old find_package result contains an InferenceEngine_LIBRARIES which does not hold IE::inference_engine_c_api, it should not be an issue.

@ilyachur

@ilya-lavrenov It is just a proposal... We need to check it. If I remember correctly, someone had such a problem with CMake, and he resolved it by removing the other version of OpenVINO from the environment. Maybe I am wrong and have forgotten the root cause of the original problem, but it is really easy to check.

I guess the problem can occur because we build the documentation together with OpenVINO, and in this case we get such target conflicts because the IE CMake registers some targets, but the found Inference Engine package doesn't contain such dependencies.

@lionkunonly
Contributor

> @huningxin after you build OpenVINO tests, there is a test extension library bin/intel64/Release/lib/libextension_tests.so

Hi @ilya-lavrenov, has libextension_tests.so been renamed to libtemplate_extension.so in OpenVINO 2020.4? If not, could you please explain the role played by libtemplate_extension.so? Meanwhile, I also want to know whether libextension_tests.so still exists in OpenVINO 2020.4. I need your suggestions, thanks.

@lionkunonly
Contributor

lionkunonly commented Jul 29, 2020

> @lionkunonly Could you please check that you don't have another version of OpenVINO in your environment?
> Otherwise find_package can find an incorrect version of OpenVINO.

Hi @ilyachur . I had downloaded and installed OpenVINO 2020.1 under the path /opt/intel/<openvino_version> when I met the issue. Now I have uninstalled the old version, and I can build OpenVINO without this error. Thanks for your help.

But libextension_tests.so does not exist at the path bin/intel64/Release/lib/libextension_tests.so. Was this file removed in OpenVINO 2020.4?

@dkurt

dkurt commented Jul 29, 2020

@lionkunonly, there has been no extensions library since 2019R4, if I'm not mistaken. But all the demos and apps have an option to specify a custom CPU library (the -l flag).

@ilya-lavrenov

@lionkunonly libextension_tests.so is used for tests only and is not a part of the OpenVINO package. libtemplate_extension.so is used for documentation purposes, e.g. it's marked with doxygen anchors, and https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_Extensibility_DG_Intro.html is written based on this template extension. The template extension can be used as a starting point to create your own extensions.

@ysolovyov

ysolovyov commented Oct 2, 2020

How do I load the default CPU extensions now? Previously I added them like

Core ie;
ie.AddExtension(std::make_shared<Extensions::Cpu::CpuExtensions>(), "CPU");
auto exeNetwork = ie.LoadNetwork(network_, "CPU");
inferRequest_ = exeNetwork.CreateInferRequest();

in 2019.2.242

Would this be enough in 2020.4.287? Because I've read they were integrated into libMKLDNNPlugin:

InferenceEngine::Core ie;
auto exeNetwork = ie.LoadNetwork(network_, "CPU");
inferRequest_ = exeNetwork.CreateInferRequest();

I don't get this from the documentation. Also, in all the examples I see something like

IExtensionPtr extension_ptr = make_so_pointer<IExtension>(FLAGS_l);
ie.AddExtension(extension_ptr, "CPU");

pretty understandable samples, except that FLAGS_l is not even defined in any of the files. What is that FLAGS_l? It seems to be some path to a .so or .dylib file.

If I make something like

Core ie;
IExtensionPtr extension_ptr =
      make_so_pointer<IExtension>("./lib/libMKLDNNPlugin.dylib");
ie.AddExtension(extension_ptr, "CPU");
....
auto exeNetwork = ie.LoadNetwork(network_, "CPU");
inferRequest_ = exeNetwork.CreateInferRequest();

I'm getting an error:

due to unexpected exception with message:
  dlSym cannot locate method 'CreateExtension': dlsym(0x118f9a968,
  CreateExtension): symbol not found

@dkurt

dkurt commented Oct 5, 2020

@ysolovyov, you don't need to load extensions manually after the 2019.3 release if you have not compiled them yourself. So now AddExtension is used only for users' own extensions.

@huningxin huningxin removed this from the M2 milestone Jan 13, 2021

7 participants