Does TensorFlow provide a half normal initialiser? #38727

Closed
nbro opened this issue Apr 20, 2020 · 1 comment
Assignees
Labels
type:feature Feature requests

Comments


nbro commented Apr 20, 2020

System information

  • TensorFlow version (you are using): 2.
  • Are you willing to contribute it (Yes/No): maybe

Describe the feature and the current behavior/state.

I would like to have access to more initializers, not just for the weights of layers but also for variables created e.g. with add_weight. Specifically, I would like an initializer for half normals, i.e. normals that produce only non-negative numbers (it's like applying the absolute value to a normal).
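
For illustration, a built-in version of such an initializer might look roughly like the sketch below. The class name HalfNormal is hypothetical (TF does not currently provide it); it simply takes the absolute value of samples drawn from a normal distribution.

import tensorflow as tf


class HalfNormal(tf.keras.initializers.Initializer):
    """Hypothetical half-normal initializer: returns |N(0, stddev^2)| samples."""

    def __init__(self, stddev=1.0, seed=None):
        self.stddev = stddev
        self.seed = seed

    def __call__(self, shape, dtype=None):
        dtype = dtype or tf.float32
        # Draw from a normal and take the absolute value, so every
        # returned value is non-negative.
        return tf.abs(tf.random.normal(shape, stddev=self.stddev,
                                       dtype=dtype, seed=self.seed))

    def get_config(self):
        return {"stddev": self.stddev, "seed": self.seed}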

Will this change the current api? How?

No.

Who will benefit from this feature?

Everyone.

Any other info.

I've asked a question on Stack Overflow about my specific problem, which TF currently doesn't seem to address.

@nbro nbro added the type:feature Feature requests label Apr 20, 2020
@Saduf2019 Saduf2019 assigned ymodak and unassigned Saduf2019 Apr 21, 2020

nbro commented Apr 23, 2020

Here's a custom initialiser that does what I want.

import tensorflow as tf


def random_half_normal(shape, **kwargs):
    # Sample from a standard normal and take the absolute value, so
    # every returned value is non-negative (a half-normal draw).
    return tf.abs(tf.keras.backend.random_normal(shape, **kwargs))


class MyLayer(tf.keras.layers.Layer):
    def build(self, input_shape):
        # Create a scalar, non-trainable variable initialised with the
        # custom half-normal initializer defined above.
        self.my_var = self.add_weight(name="my_var",
                                      shape=(),
                                      initializer=random_half_normal,
                                      trainable=False)

    def call(self, inputs):
        tf.print("\nself.my_var =", self.my_var)
        return inputs


def get_model():
    inp = tf.keras.layers.Input(shape=(1,))
    out = MyLayer()(inp)
    model = tf.keras.Model(inputs=inp, outputs=out)
    model.summary()
    return model


def train():
    model = get_model()
    model.compile(optimizer="adam", loss="mae")
    x_train = [2, 3, 4, 1, 2, 6]
    y_train = [1, 0, 1, 0, 1, 1]
    model.fit(x_train, y_train)


if __name__ == '__main__':
    train()
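
The same callable should also work anywhere Keras accepts an initializer argument, since Keras calls it with the required shape (and dtype). For instance, with a plain Dense layer (shown only as an assumed illustration):

layer = tf.keras.layers.Dense(4, kernel_initializer=random_half_normal)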

See also my answer on Stack Overflow.

@nbro nbro closed this as completed Apr 23, 2020