LocallyConnected2DLayer params not initialized correctly #918

Open
guoxuesong opened this issue May 26, 2019 · 1 comment

The following code shows that a LocallyConnected2DLayer with W initialized using HeNormal(1.0) gives an output std that is 1/(width*height)**0.5 times that of a Conv2DLayer with the same initialization.

import numpy as np
import theano
import theano.tensor as T

import lasagne
from lasagne.layers import *

input_var = T.tensor4('inputs')
def build_network(input_var, using_local):
    network = InputLayer(shape=(None,3,64,64), input_var=input_var)
    if using_local:
        network = LocallyConnected2DLayer(
                network, num_filters=256, filter_size=(3,3), untie_biases=True, pad='same',
                nonlinearity=None,
                W=lasagne.init.HeNormal(1.0)
                )
    else:
        network = Conv2DLayer(
                network, num_filters=256, filter_size=(3,3), pad='same',
                nonlinearity=None,
                W=lasagne.init.HeNormal(1.0)
                )
    return network

local_fn = theano.function([input_var],get_output(build_network(input_var,True)).std())
conv_fn = theano.function([input_var],get_output(build_network(input_var,False)).std())

data = np.random.normal(0,1,(64,3,64,64)).astype('float32')
print(local_fn(data))
print(conv_fn(data))

The output is:

0.015465997
0.9949956

f0k commented Jun 20, 2019

Thanks for the report, and sorry for the late reply! Indeed, the fan-in is always computed as the product of all weight dimensions except the first: https://github.com/Lasagne/Lasagne/blob/master/lasagne/init.py#L251
This heuristic works for conv1d/2d/3d, but in a locally-connected layer, the weight shape will be (output_channels, input_channels, filter_size0, filter_size1, output_size0, output_size1). The last two dimensions should not be included in the fan-in.
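
For the shapes in the example above (3 input channels, 3x3 filters, 64x64 output), this inflates the fan-in by a factor of 64*64. A quick sanity check of the numbers (just arithmetic on the example's shapes, using the fact that the He initializers draw weights with std = gain * sqrt(1 / fan_in)):

correct_fan_in = 3 * 3 * 3               # input_channels * filter_size0 * filter_size1
inflated_fan_in = 3 * 3 * 3 * 64 * 64    # additionally includes output_size0 * output_size1
print((float(correct_fan_in) / inflated_fan_in) ** 0.5)  # 0.015625 = 1/64, matching 0.0155 / 0.995 above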

A workaround is to pass gain=np.sqrt(np.prod(network.output_shape[-2:])) to the initializer (making use of the fact that the output size equals the input size, since the implementation only supports same padding and unit stride).
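
Applied to the example from the report, the workaround would look roughly like this (a sketch only; because only 'same' padding and unit stride are supported, the incoming layer's spatial size equals the output size):

import numpy as np
import theano.tensor as T
import lasagne
from lasagne.layers import InputLayer, LocallyConnected2DLayer

input_var = T.tensor4('inputs')
incoming = InputLayer(shape=(None, 3, 64, 64), input_var=input_var)
gain = np.sqrt(np.prod(incoming.output_shape[-2:]))  # sqrt(64 * 64) = 64
network = LocallyConnected2DLayer(
        incoming, num_filters=256, filter_size=(3, 3), untie_biases=True,
        pad='same', nonlinearity=None,
        W=lasagne.init.HeNormal(gain))  # gain compensates for the inflated fan-in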

I'm not sure whether there's something we can do about this in LocallyConnected2DLayer. We could test whether W is an instance of a lasagne.init.Initializer subclass that has a gain attribute, and if so, multiply the result accordingly. Alternatively, we could change the shape passed to the initializer to (output_channels * output_size0 * output_size1, input_channels, filter_size0, filter_size1) and reshape/dimshuffle the result; this would take the form of a lambda function wrapping the initializer before it is passed to create_param, applied only if the initializer is callable.
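
A rough sketch of the second idea (illustrative only, not wired into create_param; wrap_local_init is a made-up name):

import numpy as np

def wrap_local_init(init):
    # Arrays and shared variables are passed through untouched; only callables are wrapped.
    if not callable(init):
        return init
    def sample(shape):
        # shape is (num_filters, channels, filter_size0, filter_size1, output_size0, output_size1)
        nf, ch, f0, f1, o0, o1 = shape
        w = np.asarray(init((nf * o0 * o1, ch, f0, f1)))  # initializer now sees fan-in = ch * f0 * f1
        w = w.reshape(nf, o0, o1, ch, f0, f1)
        return np.transpose(w, (0, 3, 4, 5, 1, 2))        # back to the layer's 6D weight layout
    return sample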
