Fix issues in maxout layer #22031

Merged 1 commit into tensorflow:master on Sep 11, 2018
Conversation

@rogerxcn commented Sep 3, 2018

Originally pointed out by @ilblackdragon in this pull request:
shape[axis] should be set to num_units instead of -1.

The original implementation causes trouble. For example, applying maxout to a tensor x of shape (?, 32, 32, 256) to reduce it to shape (?, 32, 32, 128) via x = tf.contrib.layers.maxout(x, 128) yields a tensor with static shape (?, 32, 32, ?) instead of (?, 32, 32, 128).
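A minimal repro sketch of the shape-inference symptom, assuming a TF 1.x graph where tf.contrib is available:

```python
import tensorflow as tf  # TF 1.x

x = tf.placeholder(tf.float32, shape=[None, 32, 32, 256])
y = tf.contrib.layers.maxout(x, num_units=128)

# With the old reshape to -1 the static shape comes out as (?, 32, 32, ?);
# with shape[axis] = num_units it is (?, 32, 32, 128).
print(y.get_shape())
```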

The shape of the tensor after the maxout can be inferred from num_units, so there is no need to use -1 here. Leaving the dimension unknown is also inconvenient and raises errors when a dense layer (or any other layer that needs the static value of the last dimension) is applied afterwards.

In addition, the original documentation incorrectly says that "num_units should be a multiple of 'axis'" (the actual constraint is that the size of the reduced axis must be divisible by num_units), and this PR fixes the description.
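For illustration, here is a minimal sketch (not the library code itself) of a maxout over the last axis that keeps the static shape, assuming only the batch dimension is unknown; it also shows the corrected constraint that the number of channels must be divisible by num_units:

```python
import tensorflow as tf

def maxout_sketch(inputs, num_units):
    """Maxout over the last axis: split the channels into num_units groups and take the max of each."""
    shape = inputs.get_shape().as_list()
    num_channels = shape[-1]
    if num_channels % num_units != 0:
        raise ValueError('number of channels (%d) must be a multiple of num_units (%d)'
                         % (num_channels, num_units))
    # Keep the static shape: the reduced axis becomes num_units, not -1.
    new_shape = [-1 if d is None else d for d in shape[:-1]]
    new_shape += [num_units, num_channels // num_units]
    return tf.reduce_max(tf.reshape(inputs, new_shape), axis=-1)

x = tf.placeholder(tf.float32, shape=[None, 32, 32, 256])
print(maxout_sketch(x, 128).get_shape())  # (?, 32, 32, 128)
```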

@martinwicke added the kokoro:force-run and ready to pull labels on Sep 5, 2018
@kokoro-team removed the kokoro:force-run label on Sep 5, 2018
@tensorflow-copybara merged commit d118516 into tensorflow:master on Sep 11, 2018
tensorflow-copybara pushed a commit that referenced this pull request Sep 11, 2018
PiperOrigin-RevId: 212399243
Labels: cla: yes, ready to pull