
supporting N-D Blobs in Dropout layer Reshape #3725

Merged
merged 1 commit into from
Feb 28, 2016

Conversation

shaibagon
Member

The Reshape method of the Dropout layer reshapes rand_vec_ using the deprecated shape accessors (num(), channels(), height(), and width()). Replacing these with the newer shape() method allows the Dropout layer to work on Blobs with num_axes() > 4.

This PR is a small patch

Thank you!

@shaibagon shaibagon closed this Feb 25, 2016
@shaibagon shaibagon reopened this Feb 25, 2016
@shaibagon shaibagon closed this Feb 25, 2016
@shaibagon shaibagon reopened this Feb 25, 2016
@jeffdonahue
Contributor

Thanks @shaibagon, LGTM. Please squash your changes to a single commit and I'll merge this.

@shaibagon
Member Author

@jeffdonahue squashed.
Thanks!
BTW, Travis CI seems to have had some problems lately; I hope this will not interfere with this PR.

Thank you very much,
Shai

@jeffdonahue
Contributor

Thanks @shaibagon. I'll merge since one of the Travis checks passed and the others only failed due to the unrelated Travis issues.

jeffdonahue added a commit that referenced this pull request Feb 28, 2016
supporting N-D Blobs in Dropout layer Reshape
@jeffdonahue jeffdonahue merged commit c2769c1 into BVLC:master Feb 28, 2016
@shaibagon
Member Author

@jeffdonahue Thank you!

@shaibagon shaibagon deleted the drop_nd_blobs branch February 28, 2016 08:33
fxbit pushed a commit to Yodigram/caffe that referenced this pull request Sep 1, 2016
supporting N-D Blobs in Dropout layer Reshape