I'm working on a few extensions and basing my work on this mxnet image captioning implementation that you built. However, on some of my own custom data, I'm getting the following error:
mxnet.base.MXNetError: Error in operator split0: [16:42:17] src/operator/./slice_channel-inl.h:208: Check failed: dshape[real_axis] % param_.num_outputs == 0U (262 vs. 0) You are trying to split the 1-th axis of input tensor with shape [50,262,256] into num_outputs=494 evenly sized chunks, but this is not possible because 494 does not evenly divide 262
I'm using a batch size of 50 and an embedding dim of 256, and judging from the input tensor shape [50,262,256], the word data has length 262.
Do you know where num_outputs=494 is set and why this error might be happening? I'm running your code as-is, with no modifications other than using my own data. The error seems to occur during the forward pass on the validation set.
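For what it's worth, the check that fires in slice_channel-inl.h can be reproduced outside MXNet. A minimal NumPy sketch using the shapes from the error message (batch 50, sequence length 262, embedding 256 are taken from the traceback; the value 494 presumably comes from a sequence length configured elsewhere, e.g. from the training data):

```python
import numpy as np

# Shapes taken from the error message: batch=50, seq_len=262, embed=256
x = np.zeros((50, 262, 256))

# MXNet's split (SliceChannel) requires num_outputs to evenly divide the
# target axis. 262 % 494 != 0, so the equivalent NumPy split fails too:
try:
    np.split(x, 494, axis=1)
except ValueError as e:
    print("split failed:", e)

# Splitting into 262 chunks (one per time step) succeeds, which is what an
# unrolled RNN expects when num_outputs matches the sequence length:
steps = np.split(x, 262, axis=1)
print(len(steps), steps[0].shape)  # 262 chunks of shape (50, 1, 256)
```

This suggests num_outputs is being set to a sequence length (494) that doesn't match the validation batch's actual sequence length (262), e.g. the network was unrolled for the training set's max length but fed shorter validation sequences.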
Not sure if this is a bug.
Thanks!