All right, no one replied. Well, I have one idea for telling apart the stride and padding behavior of nn.SpatialConvolutionMM and nn.SpatialConvolutionCUDA. Keep in mind that SpatialConvolutionMM supports stride mode only on CUDA (great hint from @soumith).
To keep it short, I'll just list the conclusions here:
nn.SpatialConvolutionMM( nInputPlane, nOutputPlane, kW, kH, dW, dH, padding ) computes the output size as
out_W = ( inWidth + 2*padding - kW ) / dW + 1
nn.SpatialConvolutionCUDA( nInputPlane, nOutputPlane, kW, kH, dW, dH, padding, partialSum ) uses the equation below. I still have no idea what partialSum does; how should it be used? @_@
out_W = ( inWidth + padding - kW ) / dW + 1
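To make the difference concrete, here is a small sketch of the two output-size formulas above (plain Python, not Torch; the function names are my own):

```python
def out_size_mm(in_size, k, d, padding):
    # nn.SpatialConvolutionMM: padding is applied on both sides (2*padding)
    return (in_size + 2 * padding - k) // d + 1

def out_size_cuda(in_size, k, d, padding):
    # nn.SpatialConvolutionCUDA: padding is counted only once
    return (in_size + padding - k) // d + 1

# Same input (width 61, 7x7 kernel, stride 1, padding 2) gives different sizes:
print(out_size_mm(61, 7, 1, 2))   # 59
print(out_size_cuda(61, 7, 1, 2)) # 57
```

So for identical constructor arguments the two modules produce differently sized outputs, which is exactly the trap described above.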
Thanks to @soumith: SpatialConvolutionMM uses the BDHW layout, while SpatialConvolutionCUDA uses the DHWB layout.
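For illustration, converting between the two layouts is just an axis permutation (a NumPy sketch with made-up sizes; B = batch, D = planes, H = height, W = width):

```python
import numpy as np

# Hypothetical batch: 16 images, 8 planes, 61x61 spatial size
bdhw = np.zeros((16, 8, 61, 61))       # layout expected by SpatialConvolutionMM
dhwb = bdhw.transpose(1, 2, 3, 0)      # layout expected by SpatialConvolutionCUDA
print(dhwb.shape)  # (8, 61, 61, 16)
```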
The convolution modules of the network are:
```
3 :
{
  padding : 3
  kW : 7
  nInputPlane : 8
  gradBias : CudaTensor - size: 8
  dW : 1
  gradWeight : CudaTensor - size: 8x392
  output : CudaTensor - size: 1x8x61x61
  fgradInput : CudaTensor - size: 61x61
  finput : CudaTensor - size: 392x3721
  bias : CudaTensor - size: 8
  weight : CudaTensor - size: 8x392
  nOutputPlane : 8
  gradInput : CudaTensor - empty
  kH : 7
  dH : 1
}
4 :
{
  padding : 2
  kW : 7
  nInputPlane : 8
  gradBias : CudaTensor - size: 12
  dW : 1
  gradWeight : CudaTensor - size: 12x392
  output : CudaTensor - size: 1x12x59x59
  fgradInput : CudaTensor - size: 59x59
  finput : CudaTensor - size: 392x3481
  bias : CudaTensor - size: 12
  weight : CudaTensor - size: 12x392
  nOutputPlane : 12
  gradInput : CudaTensor - empty
  kH : 7
  dH : 1
}
```
As listed above, the output size of layer 3 is 8x61x61.
Layer 4 is defined as nn.SpatialConvolutionMM(8, 12, 7, 7, 1, 1, 3),
from which the expected output width is
(61 + 3 - 7)/1 + 1 = 58,
but the module info shows that the output is 12x59x59.
Can anyone help me figure this out?
thx~
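For what it's worth, a quick check: the module dump above stores padding : 2 for layer 4 (not 3), and the MM formula doubles the padding, so plugging those values in reproduces the observed 59:

```python
# Values taken from the layer-4 dump above: inWidth=61, kW=7, dW=1, padding=2
# MM formula: out_W = (inWidth + 2*padding - kW) / dW + 1
print((61 + 2 * 2 - 7) // 1 + 1)  # 59
```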