[jsk_perception/apply_mask] Avoid error in cvtColor when masked_image is empty #2725
Conversation
We can reproduce the error with the following launch file.
When an input mask image has no non-zero values, as with the launch file above, the size of the clipped mask (the max region) is (0, 0). If a (0, 0) image is passed to cvtColor, an error occurs.
I confirmed that this problem occurs in noetic but not in melodic. This change does not affect current behaviour.
Please consider the case where mask_black_to_transparent is true.
… is empty in case of mask_black_to_transparent is True
@nakane11 Thanks! I fixed the case.
Thank you too!
I confirmed that the error can also be avoided in the case of mask_black_to_transparent=true.
From OpenCV version 4, cvtColor doesn't support an empty matrix. If ~clip is true, the output mask size could be (w, h) = (0, 0). In that case, the program dies in the cvtColor function for bgra and rgba images. This PR fixes the bug.