Dropout across multiple GPUs #19160
Unanswered · vanshilshah97 asked this question in DDP / multi-GPU / multi-node
Replies: 0
Conceptually, is it necessary to set the same random seed in every process under DDP so that the dropout layers behave identically across processes, or is identical dropout behavior not required for correct model training?
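To make the scenario concrete, here is a minimal stdlib-only sketch (deliberately not using PyTorch, so it runs anywhere) of the mechanism the question is about: each DDP rank draws its dropout mask from its own RNG, so whether the masks agree across ranks comes down to whether the per-process seeds agree. The `dropout_mask` helper and the per-rank seeds are illustrative assumptions, not part of any DDP API.

```python
import random

def dropout_mask(seed: int, n: int, p: float = 0.5):
    """Simulate the Bernoulli keep/drop mask a dropout layer would
    draw from a per-process RNG seeded with `seed` (1 = keep, 0 = drop)."""
    rng = random.Random(seed)
    return [0 if rng.random() < p else 1 for _ in range(n)]

# Two ranks that share one seed draw identical masks every step ...
assert dropout_mask(seed=42, n=16) == dropout_mask(seed=42, n=16)

# ... while ranks seeded differently (e.g. seed = base + rank, a common
# pattern) will generally draw different masks for the same layer.
mask_rank0 = dropout_mask(seed=0, n=16)
mask_rank1 = dropout_mask(seed=1, n=16)
print("rank0:", mask_rank0)
print("rank1:", mask_rank1)
```

Note that this only illustrates the mask-generation side of the question; it does not settle whether diverging masks across ranks are harmful, since DDP averages gradients rather than activations.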