Update on "[Gradient Compression] Add a random generator to PowerSGD …
…state for initializing low-rank matrix Q" Previously the random seed is the length of input tensor, which is not guaranteed to be the different for different batches. Now initialize a random generator in PowerSGD state, and use this generator to create a random seed to randomize the low-rank tensor Q at every step. Therefore, the initial tensor Q should be the same across all the replicas at the same step, but different at different steps. 'torch.manual_seed' is used in the same way as https://github.com/epfml/powersgd/blob/master/gradient_reducers.py#L675 Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202 Differential Revision: [D25191589](https://our.internmc.facebook.com/intern/diff/D25191589/) [ghstack-poisoned]
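A minimal sketch of the idea, not the actual `torch.distributed` hook implementation: the `PowerSGDState` and `init_q` names here, and the use of NumPy's `RandomState` as the per-state generator, are illustrative assumptions. Each replica constructs the state with the same seed, so the generators stay in lockstep; at every step the generator produces a fresh seed, `torch.manual_seed` is called with it, and Q is drawn with `torch.randn`, making Q identical across replicas within a step but different between steps.

```python
# Illustrative sketch only; PowerSGDState and init_q are hypothetical names
# and do not mirror the actual torch.distributed implementation.
import numpy as np
import torch

class PowerSGDState:
    def __init__(self, random_seed=0):
        # Every replica constructs the state with the same seed, so the
        # per-state generators produce identical seed sequences.
        self.rng = np.random.RandomState(random_seed)

def init_q(state, n, rank, device="cpu"):
    # All replicas draw the same seed at the same step, because their
    # generators have emitted the same sequence so far.
    seed = int(state.rng.randint(1_000_000_000))
    torch.manual_seed(seed)
    # Identical seed -> identical Q across replicas at this step; a fresh
    # seed each step keeps Q different between steps.
    return torch.randn(n, rank, device=device)

if __name__ == "__main__":
    state = PowerSGDState(random_seed=42)
    q1 = init_q(state, n=1024, rank=4)
    q2 = init_q(state, n=1024, rank=4)
    assert not torch.equal(q1, q2)  # Q varies across steps
```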