This repository has been archived by the owner on Mar 15, 2024. It is now read-only.
I recently referenced this project to implement DDP training code, and thanks to it I was able to implement and reproduce the performance of our model.
I have one question related to DDP.
Why should we set a different seed per GPU with DDP?
Hi @developer0hye,
Thanks for your question.
It ensures that operations that may draw random values, such as data augmentations, are not exactly the same across the different GPUs.
Best,
Hugo
deit/main.py, line 182 (commit 35cd455)
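The idea behind that line can be sketched as follows. This is a minimal illustration, not the verbatim deit code: it assumes each process offsets a shared base seed by its rank (which in real DDP code would come from `torch.distributed.get_rank()`), so random draws such as augmentation parameters differ across GPUs. The `seed_per_rank` helper name is hypothetical.

```python
import random

def seed_per_rank(base_seed: int, rank: int) -> int:
    # Offset the base seed by the process rank so each DDP worker
    # gets its own random stream. In real DDP code, rank would come
    # from torch.distributed.get_rank(); here it is a plain argument
    # so the sketch runs without a distributed setup.
    seed = base_seed + rank
    random.seed(seed)
    return seed

# Two simulated ranks draw different "augmentation" randomness:
seed_per_rank(42, rank=0)
draws_rank0 = [random.random() for _ in range(3)]
seed_per_rank(42, rank=1)
draws_rank1 = [random.random() for _ in range(3)]
print(draws_rank0 != draws_rank1)  # True: the streams diverge
```

With a single shared seed, every GPU would apply identical augmentations to its shard each step, reducing the effective diversity of the batch; the per-rank offset keeps training deterministic per process while decorrelating the processes.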