When we use the Horovod backend and perform gradient accumulation, we get the following error:

```
AssertionError: Gradients were computed more than backward_passes_per_step times before call to step(). Increase backward_passes_per_step to accumulate gradients locally.
```
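For reference, here is a minimal sketch that reproduces the assertion with the raw Horovod API; the model, data, and `accumulation_steps` value are illustrative placeholders:

```python
import horovod.torch as hvd
import torch

hvd.init()
torch.manual_seed(0)

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# backward_passes_per_step defaults to 1, so accumulating gradients
# over several backward() calls before step() trips the assertion.
optimizer = hvd.DistributedOptimizer(
    optimizer, named_parameters=model.named_parameters()
)

data_loader = [torch.randn(8, 10) for _ in range(8)]  # dummy data
accumulation_steps = 4

for i, batch in enumerate(data_loader):
    loss = model(batch).sum() / accumulation_steps
    loss.backward()  # raises the AssertionError on the second pass
    if (i + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```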
This happens because `horovod.DistributedOptimizer` defaults to `backward_passes_per_step=1`. To enable gradient accumulation in the distributed setting, we need to be able to change this argument, which we can do by exposing it through `ignite.distributed.auto_optim`.
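A minimal sketch of the proposed usage, assuming `auto_optim` would simply forward the new keyword argument to `horovod.DistributedOptimizer` when the Horovod backend is active (the `backward_passes_per_step` parameter on `auto_optim` is the proposal here, not the current API):

```python
import ignite.distributed as idist
import torch

model = torch.nn.Linear(10, 2)
model = idist.auto_model(model)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Proposed: forward backward_passes_per_step to hvd.DistributedOptimizer,
# matching the number of local gradient-accumulation steps.
optimizer = idist.auto_optim(optimizer, backward_passes_per_step=4)
```

With this in place, a training loop that calls `backward()` four times between `step()` calls would no longer trigger the assertion, and the behavior on non-Horovod backends would be unchanged.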