This repository was archived by the owner on Aug 1, 2025. It is now read-only.

AOTAutograd should test that requires_grad of inputs matches its expectation #1927

@ezyang

Description


🐛 Describe the bug

Suppose you compile f(x, y) with x.requires_grad=True but y.requires_grad=False. AOTAutograd will generate a custom function that will always return zero gradient for y.

If you subsequently call this custom autograd function with y.requires_grad=True — which can happen, e.g., via a guard failure — AOTAutograd will SILENTLY still return a zero gradient for y, with no error. This causes accuracy errors, and accuracy errors are bad.
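A minimal sketch of the failure mode, using a hand-written torch.autograd.Function (hypothetical name CompiledMul) to stand in for what AOTAutograd generates when y.requires_grad was False at compile time — backward hardcodes "no gradient for y":

```python
import torch

class CompiledMul(torch.autograd.Function):
    # Stand-in for the AOTAutograd-generated function: at "compile time"
    # y.requires_grad was False, so backward hardcodes None for y's gradient.
    @staticmethod
    def forward(ctx, x, y):
        ctx.save_for_backward(y)
        return x * y

    @staticmethod
    def backward(ctx, grad_out):
        (y,) = ctx.saved_tensors
        # y's gradient was compiled away; None means "no gradient".
        return grad_out * y, None

x = torch.ones(3, requires_grad=True)
y = torch.ones(3, requires_grad=True)  # violates the compile-time assumption
out = CompiledMul.apply(x, y).sum()
out.backward()
print(y.grad)  # silently None instead of the correct gradient; no error raised
```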

We can detect this situation by simply checking whether the inputs' requires_grad flags are consistent with what we compiled with (if we compiled an input with requires_grad=False, we MUST NOT run with that input having requires_grad=True). This check is strictly speaking unnecessary, since Dynamo is supposed to guard against this situation, but it is pretty cheap and we should consider running it, if not putting it under debug mode.
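The proposed check can be sketched as follows (all names here are hypothetical, not actual AOTAutograd APIs): record the requires_grad flags at compile time, then assert at call time that no input went from False to True.

```python
def record_compile_time_flags(inputs):
    # Record each input's requires_grad flag at compile time.
    return [getattr(x, "requires_grad", False) for x in inputs]

def check_requires_grad(compiled_flags, runtime_inputs):
    # Raise loudly instead of silently returning a zero gradient.
    for i, (compiled, x) in enumerate(zip(compiled_flags, runtime_inputs)):
        if getattr(x, "requires_grad", False) and not compiled:
            raise RuntimeError(
                f"input {i} has requires_grad=True, but the graph was "
                f"compiled with requires_grad=False for this input"
            )

class FakeTensor:
    # Stand-in for a tensor; only requires_grad matters for this sketch.
    def __init__(self, requires_grad):
        self.requires_grad = requires_grad

flags = record_compile_time_flags([FakeTensor(True), FakeTensor(False)])
check_requires_grad(flags, [FakeTensor(True), FakeTensor(False)])  # OK
try:
    check_requires_grad(flags, [FakeTensor(True), FakeTensor(True)])
except RuntimeError as e:
    print("caught:", e)
```

Note the check is one-directional: running with requires_grad=False on an input compiled with requires_grad=True merely computes an unneeded gradient, which is wasteful but not wrong.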

Error logs

No response

Minified repro

No response

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
