What will mxnet convolution do if the dilate shape is greater than the input shape #3479

Closed
shuokay opened this Issue Oct 8, 2016 · 6 comments

shuokay (Contributor) commented Oct 8, 2016

For example, the input shape is (1, 100) and the dilate is (3, 3).

I have tried to set the dilate to (1, 3), and the error is:

mxnet/mshadow/mshadow/././extension/pack_col2patch.h:53: Check failed: (sshape[1]) == (o_height * o_width * imshape.ProdShape(0, dstdim - 3)) PackColToPatchExp: src.size(1) mismatch
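
For reference, dilation enlarges the effective kernel extent to dilate * (kernel - 1) + 1, so a (3, 3) kernel with dilate (3, 3) spans 7 pixels per axis and can never fit into an input dimension of size 1. A minimal sketch of the standard output-shape arithmetic (plain Python, names are illustrative):

    def conv_out_dim(in_dim, kernel, stride=1, dilate=1, pad=0):
        # Effective kernel extent grows with dilation.
        eff_kernel = dilate * (kernel - 1) + 1
        return (in_dim + 2 * pad - eff_kernel) // stride + 1

    print(conv_out_dim(1, 3, dilate=3))    # -5: effective kernel (7) exceeds the input (1)
    print(conv_out_dim(100, 3, dilate=3))  # 94: fine along this axis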

feiyulv commented Oct 9, 2016

That may be a bug in the backward pass of dilated convolution. Change the following code in convolution-inl.h:246

      if (param_.pad[0] == 0 && param_.pad[1] == 0) {
        Assign(gdata.Slice(i, i + step), req[conv::kData],
               pack_col2patch(temp_col,
                              data.Slice(i, i + step).shape_,
                              param_.kernel[0],
                              param_.kernel[1],
                              param_.stride[0],
                              param_.dilate[0]));
      } else {
        Shape<4> pshape = data.Slice(i, i + step).shape_;
        pshape[2] += 2 * param_.pad[0];
        pshape[3] += 2 * param_.pad[1];
        Assign(gdata.Slice(i, i + step), req[conv::kData],
               crop(pack_col2patch(temp_col,
                                   pshape,
                                   param_.kernel[0],
                                   param_.kernel[1],
                                   param_.stride[0],
                                   param_.dilate[0]),
                    gdata[i][0].shape_));
      }

to

      if (param_.pad[0] == 0 && param_.pad[1] == 0) {
        Assign(gdata.Slice(i, i + step), req[conv::kData],
               pack_col2patch(temp_col,
                              data.Slice(i, i + step).shape_,
                              param_.kernel[0],
                              param_.kernel[1],
                              param_.stride[0],
                              param_.stride[1],
                              param_.dilate[0],
                              param_.dilate[1]));
      } else {
        Shape<4> pshape = data.Slice(i, i + step).shape_;
        pshape[2] += 2 * param_.pad[0];
        pshape[3] += 2 * param_.pad[1];
        Assign(gdata.Slice(i, i + step), req[conv::kData],
               crop(pack_col2patch(temp_col,
                                   pshape,
                                   param_.kernel[0],
                                   param_.kernel[1],
                                   param_.stride[0],
                                   param_.stride[1],
                                   param_.dilate[0],
                                   param_.dilate[1]),
                    gdata[i][0].shape_));
      }
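
The original call passes only param_.stride[0] and param_.dilate[0], so the horizontal components are dropped whenever stride or dilate is asymmetric; the eight-argument call above forwards both axes. A rough repro sketch with the Python API (shapes here are illustrative): before the change, the backward pass trips the shape check quoted above; after rebuilding with the change, it should run through.

    import mxnet as mx

    # Asymmetric dilate: the second component was ignored in backward before the fix.
    data = mx.sym.Variable('data')
    conv = mx.sym.Convolution(data=data, kernel=(3, 3), dilate=(1, 3),
                              num_filter=8, name='conv')
    exe = conv.simple_bind(ctx=mx.cpu(), data=(1, 1, 32, 32))
    exe.forward(is_train=True)
    exe.backward(mx.nd.ones(exe.outputs[0].shape))  # exercises pack_col2patch
    print(exe.grad_dict['data'].shape)  # expect (1, 1, 32, 32) once the fix is in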

winstywang (Contributor) commented Oct 10, 2016

@feiyulv Could you submit a PR on this?

winstywang (Contributor) commented Oct 10, 2016

It may be related to this PR: #2365

feiyulv commented Oct 10, 2016

@winstywang yes, the same problem

winstywang (Contributor) commented Oct 12, 2016

@feiyulv Could you submit a PR on this? The old PR has been inactive for a long time.

yajiedesign (Contributor) commented Sep 28, 2017

This issue is closed due to lack of activity in the last 90 days. Feel free to reopen if this is still an active issue. Thanks!
