Implementing dropout during training #66
Hi David,

The radial networks in NequIP are implemented as invariant MLPs acting on scalar features, so standard elementwise dropout can be applied there without breaking symmetry.

The other option, as you mention, is dropout on the equivariant feature vectors. Equivariant dropout is trickier: you can't just drop individual elements from the tensor without breaking symmetry. Instead, you have to drop out or retain entire irrep instances together. There is an implementation of this kind of dropout in e3nn (`e3nn.nn.Dropout`).

You could put equivariant dropout in a lot of places. The most obvious place to me would be between convolution layers, but it could go anywhere equivariant data is propagated. To add something like this to NequIP, I would take advantage of NequIP's configurable model-building system.

I would accept a PR implementing either/both of these features if you are interested! 😃
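To make the irrep-wise idea concrete, here is a minimal sketch of equivariant dropout on a flattened feature vector. It assumes the features are a concatenation of irrep blocks of sizes `2l+1` (e.g. `2x0e + 1x1o` → blocks of sizes 1, 1, 3); the function name and signature are illustrative, not NequIP's or e3nn's actual API:

```python
import numpy as np

def equivariant_dropout(x, irrep_dims, p, rng):
    """Drop whole irrep instances (hypothetical helper, not NequIP code).

    Each contiguous block of size 2l+1 is either kept, scaled by 1/(1-p)
    as in standard inverted dropout, or zeroed in full. Because the mask
    is constant within each irrep block, it commutes with the rotation
    acting on that block, so equivariance is preserved.
    """
    assert len(x) == sum(irrep_dims), "irrep block sizes must tile the vector"
    mask = []
    for d in irrep_dims:
        keep = rng.random() >= p          # one Bernoulli draw per irrep instance
        mask.extend([keep / (1.0 - p)] * d)  # same factor for all 2l+1 components
    return x * np.array(mask)
```

Dropping per-element instead (one draw per component) would zero, say, only the y-component of a vector feature, which singles out a direction in space and breaks rotational symmetry; one draw per block avoids that.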
---
Hello,
It seems like NequIP doesn't currently support dropout. What is the right way to implement this in the source code? Also, which layers would dropout be applicable to?
From a previous discussion it seems like dropout should happen in:
- certainly in the invariant radial network MLPs
- certainly not in the chemical embedding
- certainly not in any TPs
- to discuss: in the equivariant linear self-interaction layers (shouldn’t mess with equivariance)
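For the first bullet, ordinary elementwise dropout is unproblematic because the radial network only ever sees rotation-invariant scalars. A minimal sketch (layer shapes and function name are illustrative, not NequIP's actual radial network):

```python
import numpy as np

def radial_mlp_with_dropout(r, W1, W2, p, rng, training=True):
    """Invariant radial MLP with standard inverted dropout (sketch only).

    The inputs are scalar (l=0) radial features, so zeroing individual
    elements cannot single out a spatial direction: any per-element mask
    is automatically equivariant here.
    """
    h = np.maximum(W1 @ r, 0.0)                      # hidden layer, ReLU
    if training and p > 0:
        mask = (rng.random(h.shape) >= p) / (1.0 - p)
        h = h * mask                                 # elementwise inverted dropout
    return W2 @ h
```

With `training=False` (or `p=0`) the layer is the identity on the hidden activations, matching the usual train/eval dropout convention.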
Thanks!
David