
BatchNorm and Dropout #19

Open
dfdx opened this issue Dec 21, 2017 · 6 comments
@dfdx (Contributor) commented Dec 21, 2017

Does it make sense to put functional forms of BatchNorm and Dropout into NNlib so that other packages could simply import them from here?
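To make the proposal concrete, a functional dropout is essentially a pure function of the input and the drop probability. A minimal, framework-free sketch in Python (the name `dropout` and the list-based layout are illustrative, not NNlib's actual API):

```python
import random

def dropout(x, p, training=True, rng=random):
    # Inverted dropout: zero each element with probability p and scale
    # the survivors by 1/(1-p), so the expected value of each element
    # is unchanged and no rescaling is needed at inference time.
    if not training or p == 0.0:
        return list(x)
    scale = 1.0 / (1.0 - p)
    return [xi * scale if rng.random() >= p else 0.0 for xi in x]
```

Because the function takes `training` as an argument and holds no state, any package can call it directly without depending on a particular layer type.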

@MikeInnes (Member)
Yeah, that's a great idea.

@staticfloat (Contributor)

BatchNorm is going to be tricky because of its "statistics update step". The current thinking behind this with Zygote is to do something along the lines of https://gist.github.com/staticfloat/a509b1e1cb1fb556028779722c2531e6
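The idea in the linked gist is to keep the statistics update out of the differentiated computation by returning updated running statistics from the forward pass instead of mutating them in place. A minimal, framework-free sketch in Python (1-D batch, scalar `gamma`/`beta`; all names are illustrative assumptions, not NNlib's API):

```python
def batchnorm(x, gamma, beta, running_mean, running_var,
              momentum=0.1, eps=1e-5, training=True):
    # Functional batch normalization. Rather than mutating the running
    # statistics in place, the updated statistics are returned alongside
    # the output; the caller (or the AD system) can then apply the state
    # update outside the differentiated part of the computation.
    if training:
        n = len(x)
        mean = sum(x) / n
        var = sum((xi - mean) ** 2 for xi in x) / n
        new_mean = (1 - momentum) * running_mean + momentum * mean
        new_var = (1 - momentum) * running_var + momentum * var
    else:
        # At inference time, normalize with the stored statistics.
        mean, var = running_mean, running_var
        new_mean, new_var = running_mean, running_var
    y = [gamma * (xi - mean) / (var + eps) ** 0.5 + beta for xi in x]
    return y, new_mean, new_var
```

Returning `(y, new_mean, new_var)` keeps the function pure, which sidesteps the "statistics update step" problem: only `y` needs a gradient, and the stats update is an ordinary value the layer can store.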

@ToucheSir (Member)

Now that Flux's normalization interface has been re-worked and GPU batchnorm moved from CUDA.jl -> NNlibCUDA, perhaps we should revisit this. The only reason https://github.com/FluxML/Flux.jl/tree/master/src/cuda exists at all now is to accommodate a non-standard implementation of batchnorm, so getting rid of that would be great.

@CarloLucibello (Member)

I was looking into porting the functional form of the normalization layers here, but I'm not sure how to handle the Zygote.ignore block without making NNlib depend on Zygote.

@DhairyaLGandhi (Member)

This concern was raised earlier and is addressed by FluxML/Flux.jl#1509.

@ToucheSir (Member) commented Jun 14, 2021

I don't think there's any reason these dropout functions need to live in Flux. Shall we move them over? Happy to volunteer for a copy-paste PR if we're all in agreement. This would also unblock FluxML/Flux.jl#1572.
