
dnn: missed fusion opportunities by using case-sensitive string comparisons to identify layer types #16877

Closed
YashasSamaga opened this issue Mar 22, 2020 · 3 comments · Fixed by #16878

Comments

@YashasSamaga
Contributor

YashasSamaga commented Mar 22, 2020

System information (version)
Detailed description
[FORWARD] Convolution 557
[SKIPPED] BatchNorm 558
[FORWARD] Relu 559
[FORWARD] Convolution 560
[FORWARD] Convolution 561
[SKIPPED] BatchNorm 562
[FORWARD] Relu 563
The skipped/forward summary above was obtained by adding the following line to forwardLayer:
`std::cout << (ld.skip ? "[SKIPPED]" : "[FORWARD]") << ' ' << ld.type << ' ' << ld.name << std::endl;`

ReLU layers are not being fused with convolution layers. This happens because "Relu" is not equal to "ReLU" under the case-sensitive string comparison that fuseLayers uses to identify layer types for fusion.
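For illustration, here is a minimal self-contained sketch of the mismatch (the toLower helper stands in for cv::toLowerCase; the variable names are placeholders, not the actual fuseLayers code):

#include <algorithm>
#include <cctype>
#include <iostream>
#include <string>

// Stand-in for cv::toLowerCase, which the layer factory applies to type names.
static std::string toLower(std::string s)
{
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    return s;
}

int main()
{
    const std::string typeFromOnnx = "Relu"; // op type as stored in the model

    // fuseLayers-style check: case-sensitive, so fusion is skipped.
    std::cout << (typeFromOnnx == "ReLU") << '\n';          // prints 0

    // Factory-style lookup: case-insensitive, so the layer is still created.
    std::cout << (toLower(typeFromOnnx) == "relu") << '\n'; // prints 1
    return 0;
}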

Steps to reproduce

ONNX model (taken from #16869)

Issue submission checklist
  • I report the issue, it's not a question
  • I checked the problem with documentation, FAQ, open issues,
    answers.opencv.org, Stack Overflow, etc. and have not found a solution
  • I updated to latest OpenCV version and the issue is still there
  • There is reproducer code and related data files: videos, images, onnx, etc
@dkurt
Member

dkurt commented Mar 22, 2020

That's interesting. But how is a "Relu" layer initialized when only "ReLU" is registered? I thought it would throw something like

CV_Error(Error::StsError, "Can't create layer \"" + name + "\" of type \"" + type + "\"");

source:

CV_DNN_REGISTER_LAYER_CLASS(ReLU, ReLULayer);

For example, the TensorFlow importer even remaps the original type names:

std::string dnnType = type;
if (type == "Abs") dnnType = "AbsVal";
else if (type == "Tanh") dnnType = "TanH";
else if (type == "Relu") dnnType = "ReLU";
else if (type == "Relu6") dnnType = "ReLU6";
else if (type == "Elu") dnnType = "ELU";

I can reproduce that the layer types are inconsistent:

[FORWARD] Convolution 553
[FORWARD] MVN 554/MVN
[SKIPPED] BatchNorm 554
[FORWARD] Relu 555
[FORWARD] Pooling 556
[FORWARD] Convolution 557
[SKIPPED] BatchNorm 558
[SKIPPED] Relu 559

(Tested on CPU, where fusion matches layers by dynamic_cast, so it still works there; for the OpenCL/CUDA backends, however, this is a problem. Thanks @YashasSamaga!)
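To illustrate why the CPU path is unaffected, a minimal self-contained sketch (the class names are placeholders, not the actual OpenCV hierarchy): the check is structural rather than string-based:

#include <iostream>

struct Layer { virtual ~Layer() = default; };
struct ActivationLayer : Layer {};
struct ReLULayer : ActivationLayer {};

int main()
{
    Layer* next = new ReLULayer(); // created by the case-insensitive factory

    // The structural check succeeds regardless of whether the type string
    // was "ReLU", "Relu", or "relu".
    if (dynamic_cast<ActivationLayer*>(next))
        std::cout << "fusable on the CPU backend\n";

    delete next;
    return 0;
}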

dkurt added this to the 3.4.10 milestone Mar 22, 2020
@YashasSamaga
Contributor Author

YashasSamaga commented Mar 22, 2020

I couldn't find the ONNX importer using "ReLU" anywhere (it uses "ReLU6" and so on, but plain "ReLU" does not appear). This bug might be specific to the ONNX importer.

Here is my guess on how this bug might have happened.

Opening the ONNX model in a hex editor reveals the string "Relu", so "Relu" (probably as an op type) is present in the ONNX model binary.

The ONNX importer takes the type name from node_proto (which is probably read directly from the ONNX file and can therefore be "Relu"):

std::string layer_type = node_proto.op_type();
layerParams.type = layer_type;

The layer factory applies toLowerCase to the type before looking up the registered layers:

opencv/modules/dnn/src/dnn.cpp

Lines 4985 to 5003 in 760e9e0

Ptr<Layer> LayerFactory::createLayerInstance(const String &type, LayerParams& params)
{
    CV_TRACE_FUNCTION();
    CV_TRACE_ARG_VALUE(type, "type", type.c_str());
    cv::AutoLock lock(getLayerFactoryMutex());
    String type_ = toLowerCase(type);
    LayerFactory_Impl::const_iterator it = getLayerFactoryImpl().find(type_);
    if (it != getLayerFactoryImpl().end())
    {
        CV_Assert(!it->second.empty());
        return it->second.back()(params);
    }
    else
    {
        return Ptr<Layer>(); //NULL
    }
}

So irrespective of whether the type is "ReLU", "Relu", or "relu", a ReLU layer is created successfully.
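A minimal sketch of one way to close the gap, mirroring the TensorFlow importer's remapping quoted above (the helper name and its placement in the ONNX importer are assumptions; see #16878 for the actual fix):

#include <string>

// Hypothetical helper: map an ONNX op type to the canonical OpenCV layer
// type so that case-sensitive checks in fuseLayers see the expected name.
static std::string canonicalizeOnnxType(const std::string& type)
{
    if (type == "Relu")  return "ReLU";
    if (type == "Relu6") return "ReLU6";
    if (type == "Tanh")  return "TanH";
    if (type == "Elu")   return "ELU";
    return type;
}

// The importer would then do:
//   layerParams.type = canonicalizeOnnxType(node_proto.op_type());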

@dkurt
Member

dkurt commented Mar 22, 2020

Please take a look at #16878.
