
Initialize weights - easy example #2913

Open
FlyinTeller opened this issue Feb 2, 2018 · 4 comments

Comments

@FlyinTeller

When I try to initialize the weights and bias parameters for a Dense layer using the Python API, I get a TypeError:

l = c.layers.Dense(1000, activation=c.sigmoid, init=weights, init_bias=bias)(input_var)
  File "C:\Users\320000451\AppData\Local\Continuum\miniconda3\envs\py36\lib\site-packages\cntk\layers\layers.py", line 138, in Dense
    init_weights = _initializer_for(init, Record(output_rank=output_rank))
  File "C:\Users\320000451\AppData\Local\Continuum\miniconda3\envs\py36\lib\site-packages\cntk\layers\blocks.py", line 51, in _initializer_for
    init = initializer_with_rank(init, **rank_params)
  File "C:\Users\320000451\AppData\Local\Continuum\miniconda3\envs\py36\lib\site-packages\cntk\initializer.py", line 184, in initializer_with_rank
    return cntk_py.random_initializer_with_rank(initializer, output_rank, filter_rank)
TypeError: in method 'random_initializer_with_rank', argument 1 of type 'CNTK::ParameterInitializer const &'

Here is what I am doing:

import cntk as c
input_var = c.input_variable((180,))
l = c.layers.Dense(1000, activation=c.sigmoid, init=c.glorot_uniform())(input_var) #create Dense layer
weights = l.W.value # extract weights from layer
bias = l.b.value        # extract bias from layer
k = c.layers.Dense(1000, activation=c.sigmoid, init=weights, init_bias=bias)(input_var) #try to create new layer with same weight/bias as before

From the docs, it sounds like initializing the weights/bias with NumPy arrays should be supported. Any help is appreciated.

@jaliyae
Contributor

jaliyae commented Feb 3, 2018

It seems that you cannot set the weights directly; this was a previous answer to the same question.
However, since the layer exposes W and b, you can assign them values after creating the layer.
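
A minimal sketch of that approach (assuming the l and input_var from the snippet above):

k = c.layers.Dense(1000, activation=c.sigmoid)(input_var) # construct with the default glorot_uniform init
k.W.value = l.W.value # then overwrite the weights in place
k.b.value = l.b.value # and the bias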

@FlyinTeller
Author

FlyinTeller commented Feb 3, 2018

To quote the official docs:

init (scalar or NumPy array or cntk.initializer, defaults to glorot_uniform()) – initial value of weights W
init_bias (scalar or NumPy array or cntk.initializer, defaults to 0) – initial value of weights b

So setting the weights and biases should work; it was even added to the answer of the question you linked.
But for some reason it does not seem to be working.

@main76

main76 commented Feb 6, 2018

Use this as a workaround.

import cntk as c
import numpy as np
input_var = c.input_variable((180,))
l = c.layers.Dense(1000, activation=c.sigmoid, init=c.glorot_uniform())(input_var) #create Dense layer
weights = l.W.value # extract weights from layer
bias = l.b.value        # extract bias from layer
k = c.layers.Dense(1000, activation=c.sigmoid, init_bias=bias)(input_var) # create a new layer with the same bias; the weights are assigned below
k.W.value = weights # assign the extracted weights after construction
assert np.array_equal(k.W.value, l.W.value)

It seems that init only supports a scalar or a cntk.initializer for now. Maybe it is a documentation error.

def _initializer_for(init, rank_params=None):
    if init is None:
        raise ValueError("init parameter cannot be None")

    # scalar constant: that's it, nothing further to do here
    if np.isscalar(init):
        # BUGBUG: this is sometimes required when dimensions are unknown; shouldn't.
        from _cntk_py import constant_initializer
        return constant_initializer(init)
        #return init # TODO: change to this once this works, e.g. for layers.BatchNormalization()

    # implant additional rank parameters
    if rank_params:
        from cntk.initializer import initializer_with_rank
        init = initializer_with_rank(init, **rank_params)

    return init
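
For what it's worth, the scalar branch above does still work, so a constant initial value can be passed directly. A hypothetical example, reusing input_var from the earlier snippets:

m = c.layers.Dense(1000, activation=c.sigmoid, init=0.1, init_bias=0.0)(input_var) # scalar init is wrapped in constant_initializer, so no TypeError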

@delzac
Contributor

delzac commented Mar 10, 2019

Hi, can I confirm that we still can't set weights directly using a NumPy array?
