Support for gradient accumulation with accelerate (#874)

The latest accelerate release added support for gradient accumulation. This requires the AccelerateMixin to run the training step within a context manager, which is now done, so users can use accelerate's gradient accumulation feature. Furthermore, the learning rate scheduler is now also prepared by accelerate if it is used through skorch's LRScheduler callback. Note: if users don't use skorch's LRScheduler callback, the scheduler cannot be prepared, because there is no reliable way of detecting its use.
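To illustrate what gradient accumulation buys the user, here is a minimal, dependency-free sketch of the idea, independent of accelerate's actual API: gradients from several micro-batches are averaged before each optimizer step, mimicking a larger effective batch size. The `grad_fn` callable and the list-based "optimizer steps" are purely illustrative, not part of skorch or accelerate.

```python
def train_with_accumulation(batches, grad_fn, accumulation_steps):
    """Sketch of gradient accumulation: average gradients over
    `accumulation_steps` micro-batches, then take one optimizer step."""
    steps = []            # records each simulated optimizer step
    accumulated = 0.0     # running (scaled) gradient sum
    for i, batch in enumerate(batches, start=1):
        # Scale each micro-batch gradient so the sum is a mean.
        accumulated += grad_fn(batch) / accumulation_steps
        if i % accumulation_steps == 0:
            steps.append(accumulated)  # one "optimizer step"
            accumulated = 0.0
    return steps

# 4 micro-batches with accumulation_steps=2 yield only 2 optimizer
# steps, each using the mean gradient of its 2 micro-batches.
steps = train_with_accumulation([1.0, 3.0, 5.0, 7.0], lambda b: b, 2)
# → [2.0, 6.0]
```

With accelerate itself, this bookkeeping is handled by running the training step inside accelerate's accumulation context manager, which is exactly what the AccelerateMixin now does for the user.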
1 parent: d7cfdcc · commit: 9bc8fe6 · 2 changed files with 98 additions and 6 deletions.