The following code shows that a LocallyConnected2DLayer with W initialized using HeNormal(1.0) ends up with a weight standard deviation that is 1/(width*height)**0.5 times that of a Conv2DLayer with the same initialization.
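A minimal reconstruction of that comparison (the input shape, channel counts, and filter size here are only illustrative, and it assumes LocallyConnected2DLayer is importable from lasagne.layers):

```python
import numpy as np
from lasagne.layers import InputLayer, Conv2DLayer, LocallyConnected2DLayer
from lasagne.init import HeNormal

l_in = InputLayer((None, 3, 8, 8))
l_conv = Conv2DLayer(l_in, num_filters=16, filter_size=(3, 3), pad='same',
                     W=HeNormal(1.0))
l_local = LocallyConnected2DLayer(l_in, num_filters=16, filter_size=(3, 3),
                                  W=HeNormal(1.0))

print(l_conv.W.get_value().std())   # ~ (1 / (3*3*3))**0.5  ~= 0.19
print(l_local.W.get_value().std())  # ~ 0.19 / (8*8)**0.5   ~= 0.024
```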
Thanks for the report, and sorry for the late reply! Indeed, the fan-in is always computed as the product of all weight dimensions except the first: https://github.com/Lasagne/Lasagne/blob/master/lasagne/init.py#L251
This heuristic works for conv1d/2d/3d, but in a locally-connected layer, the weight shape will be (output_channels, input_channels, filter_size0, filter_size1, output_size0, output_size1). The last two dimensions should not be included in the fan-in.
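Concretely, with made-up shapes, the mismatch looks like this:

```python
import numpy as np

# Illustrative weight shape for a locally connected layer:
# (output_channels, input_channels, filter_size0, filter_size1,
#  output_size0, output_size1)
w_shape = (16, 3, 3, 3, 8, 8)

fan_in_used    = np.prod(w_shape[1:])   # 1728: what init.py currently computes
fan_in_desired = np.prod(w_shape[1:4])  # 27: inputs actually feeding one unit
print((fan_in_used / fan_in_desired) ** 0.5)  # 8.0 == (8*8)**0.5, the std factor
```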
A workaround is to pass gain=np.sqrt(np.prod(network.output_shape[-2:])) to the initializer (making use of the fact that the output size equals the input size, since the implementation only supports same padding and unit stride).
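For example (with a stand-in incoming layer; the shapes are only illustrative):

```python
import numpy as np
from lasagne.layers import InputLayer, LocallyConnected2DLayer
from lasagne.init import HeNormal

network = InputLayer((None, 3, 8, 8))  # stand-in for the incoming layer
# compensate for the two extra output dimensions counted into the fan-in
gain = np.sqrt(np.prod(network.output_shape[-2:]))  # sqrt(8*8) = 8 here
l_local = LocallyConnected2DLayer(network, num_filters=16, filter_size=(3, 3),
                                  W=HeNormal(gain))
print(l_local.W.get_value().std())  # ~ (1 / (3*3*3))**0.5, matching Conv2DLayer
```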
I'm not sure if there's something we can do in LocallyConnected2DLayer about it. We could test whether W is a lasagne.init.Initializer subclass that has a gain attribute, and if so, multiply the result. We could also change the shape passed to the initializer to (output_channels * output_size0 * output_size1, input_channels, filter_size0, filter_size1) and reshape/dimshuffle the result. This could take the form of a small function wrapping the initializer before it is passed to create_param, but only if it is callable.
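A rough sketch of that second option, assuming the wrapping would happen inside LocallyConnected2DLayer before the 6D weight shape is handed to create_param (wrap_local_initializer is just an illustrative name):

```python
import numpy as np

def wrap_local_initializer(W):
    """Sketch: let a callable initializer see a 4D shape whose fan-in
    excludes the two output dimensions, then rearrange the sample back
    into the 6D locally connected weight layout."""
    if not callable(W):
        return W  # pre-built arrays / shared variables pass through unchanged

    def wrapped(shape):
        out_ch, in_ch, fh, fw, oh, ow = shape
        # Sample as if every output position were a separate filter ...
        flat = np.asarray(W((out_ch * oh * ow, in_ch, fh, fw)))
        # ... then restore (out_ch, in_ch, fh, fw, oh, ow).
        return flat.reshape(out_ch, oh, ow, in_ch, fh, fw).transpose(
            0, 3, 4, 5, 1, 2)

    return wrapped
```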