
[FSDP] Propagate requires_grad attribute to unsharded params #109892

Closed
wants to merge 1 commit

Commits on Sep 22, 2023

  1. [FSDP] Propagate requires_grad attribute to unsharded params (pytorch#109892)
    
    Summary:
    
    This preserves `requires_grad` in the case where all parameters within a `FlatParameter` have the same `requires_grad` value.
    
    Currently, unsharded parameters have `requires_grad=True` in some cases where the `FlatParameter` and all original parameters have `requires_grad=False`.
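
    A minimal sketch of the scenario (not the test added here), assuming a single-rank CPU process group with the gloo backend: freeze every original parameter, wrap the module with FSDP, and inspect `requires_grad` on the unsharded parameters exposed by `summon_full_params`.

    ```python
    import torch
    import torch.distributed as dist
    from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

    def main() -> None:
        # Single-process group just so FSDP can be constructed; gloo keeps it CPU-only.
        dist.init_process_group(
            "gloo", init_method="tcp://localhost:29500", rank=0, world_size=1
        )
        model = torch.nn.Linear(4, 4)
        for p in model.parameters():
            p.requires_grad = False  # freeze every original parameter
        fsdp_model = FSDP(model)

        # summon_full_params materializes the unsharded parameters.
        with FSDP.summon_full_params(fsdp_model):
            for p in fsdp_model.parameters():
                # Before this change, these unsharded views could report
                # requires_grad=True even though every original is frozen.
                assert not p.requires_grad

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()
    ```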
    
    This could be extended to support `FlatParameters` with a mix of `requires_grad` states by extending `ParamInfo` to capture `requires_grad` for each parameter.
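
    For illustration, a hypothetical shape for that extension (the real `ParamInfo` namedtuple lives in FSDP's internal `_flat_param` module and, as of this change, does not carry a `requires_grad` field):

    ```python
    from typing import NamedTuple
    import torch.nn as nn

    class ParamInfo(NamedTuple):
        # Fields mirroring FSDP's internal per-parameter metadata ...
        param_name: str
        module: nn.Module
        module_name: str
        # ... plus a hypothetical flag so each original parameter's
        # requires_grad value survives flattening into a FlatParameter.
        requires_grad: bool
    ```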
    
    Test Plan: test added
    
    Reviewed By: awgu
    
    Differential Revision: D49517155
    edpizzi authored and facebook-github-bot committed Sep 22, 2023
    Commit: 168c5d2