Add WeightNorm wrapper layer #21276

Closed · wants to merge 1 commit into master from seanpmorgan:14070-weight-norm

Conversation

7 participants
@seanpmorgan
Member

seanpmorgan commented Jul 31, 2018

Adds a wrapper for Weight Normalization as requested in #14070 and #10125.
Contains optional data-dependent initialization for eager execution, and works with both keras.layers and tf.layers.

I struggled to figure out where to place this Wrapper, as no other layer in contrib appears to subclass anything from tf.keras, but going forward I believe this is the direction TF is headed. Please advise if it should go somewhere else (maybe a wrappers module in contrib?).

Colab example:
https://colab.research.google.com/drive/1nBQSAA78oUBmi9fhnHJ_zWhHq2NXjwIc#scrollTo=au25bSP75hdr
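For context, weight normalization (Salimans & Kingma, 2016) reparameterizes each weight vector as w = g · v/‖v‖, decoupling the norm (g) from the direction (v). Below is a minimal NumPy sketch of that math and of the idea behind the PR's optional data-dependent initialization; the function names `weight_norm` and `data_dependent_init` are illustrative and not the PR's actual API.

```python
import numpy as np

def weight_norm(v, g):
    """Weight-norm reparameterization: w = g * v / ||v||.

    After this, ||w|| == g regardless of the scale of v, so the
    norm and direction of the weights are learned separately.
    """
    return g * v / np.linalg.norm(v)

def data_dependent_init(v, x):
    """Sketch of data-dependent init: pick g so that the
    pre-activations x @ w have roughly unit variance on the
    first batch (as in the weight-norm paper)."""
    w_dir = v / np.linalg.norm(v)
    t = x @ w_dir
    return 1.0 / np.sqrt(t.var() + 1e-10)

v = np.array([3.0, 4.0])     # arbitrary direction parameter
w = weight_norm(v, g=2.0)
print(np.linalg.norm(w))     # 2.0 -- the norm is exactly g
```

With `data_dependent_init`, the resulting pre-activations on the initializing batch have variance ~1 by construction, which is what the eager-mode option in this PR aims for.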

Wrapped graph: [screenshot: selection_002]

@googlebot googlebot added the cla: yes label Jul 31, 2018

@case540 case540 requested a review from sguada Jul 31, 2018

@case540 case540 self-assigned this Jul 31, 2018

@vikaskyadav

haha. looks good.

Add WeightNorm wrapper layer (#14070 #10125)
Fix spelling..

Fix eager run without data_init

@seanpmorgan seanpmorgan force-pushed the seanpmorgan:14070-weight-norm branch from 63e495e to 5b75424 Aug 22, 2018

@dalexander


dalexander commented Aug 22, 2018

+1

Can this be reviewed/merged?

@tensorflowbutler


Member

tensorflowbutler commented Dec 6, 2018

Nagging Assignee @case540: It has been 105 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

@alexbeloi


alexbeloi commented Dec 7, 2018

+1

I'm already using this; it would be nice to see it in an official release. Thanks for all the work from contributors and reviewers.

@seanpmorgan


Member

seanpmorgan commented Dec 7, 2018

This will be moved to a PR for tensorflow/addons.

Closing for now to stop the nagging, but I'll reference this issue within a couple of weeks.

@seanpmorgan seanpmorgan closed this Dec 7, 2018
