
Add VolumetricBatchNormalization #665

Merged
merged 1 commit into from
Feb 26, 2016
Conversation

colesbury
Contributor

All of the BatchNormalization modules now extend nn.BatchNormalization and use the same THNN/THCUNN implementation.

@fmassa
Contributor

fmassa commented Feb 24, 2016

I really like this PR, very clean and nicely done.
I have one question, though, about the case where one wants to use THNN as a standalone library: I think it's not intuitive that one should always feed a 3D tensor to BatchNormalization (even in the spatial or volumetric domain) and then resize the output tensor afterwards so that it has the right size for the next module.
I was wondering if one couldn't change this part to compute n in a for loop over the dimensions of the input beyond the first two. This solution might not apply as easily to the cunn version though, as the input dimensions are statically determined...

Maybe an easier solution (one that could work for both backends) would be to do the resizing of the input/output inside the C/CUDA functions?

@colesbury
Contributor Author

@fmassa I think that's a good idea. The tensor:view() function is implemented in Lua but not in C. I think we can effectively do the same thing in checktensor in the CUDA version.

The BatchNormalization modules now all extend nn.BatchNormalization and
use the same THNN/THCUNN implementation.
@colesbury
Contributor Author

Updated the PR with @fmassa's suggestion. (Tests pass; VolumetricMaxPooling is flaky.)

soumith added a commit that referenced this pull request Feb 26, 2016
Add VolumetricBatchNormalization
@soumith soumith merged commit 6f9543d into torch:master Feb 26, 2016
@fmassa
Contributor

fmassa commented Feb 26, 2016

Very cool, thanks @colesbury !

@hughperkins
Contributor

Hmmm, this breaks BatchNormalization in clnn :-( For example, it means that http://torch.ch/blog/2015/07/30/cifar.html will no longer run on clnn.

@fmassa
Contributor

fmassa commented Apr 9, 2016

@hughperkins The previous commit (adding C/CUDA implementations of SpatialBatchNormalization) is also not clnn-friendly. Can't we (at least temporarily) monkey-patch clnn with the previous Lua implementation? (But we should pay attention, because the meaning of some variables has changed since then.)

@hughperkins
Contributor

Yes. "we" can :-P However, it would be nice to get a little @hughperkins mention in any future pull request that strips out a Lua implementation and replaces it with a C implementation :-)

hughperkins added a commit to hughperkins/clnn that referenced this pull request Apr 9, 2016
hughperkins added a commit to hughperkins/clnn that referenced this pull request May 1, 2016
colesbury pushed a commit to colesbury/nn that referenced this pull request Feb 7, 2017
Add VolumetricBatchNormalization