
Description
🐛 Describe the bug
Suppose you compile f(x, y) with x.requires_grad=True but y.requires_grad=False. AOTAutograd will generate a custom autograd function that always returns a zero gradient for y.
If you subsequently, somehow (e.g., via a guard failure), end up calling this custom autograd function with y.requires_grad=True, AOTAutograd will SILENTLY still return a zero gradient for y. No error. This causes accuracy errors. Accuracy errors are bad.
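A minimal sketch of the failure mode, not a minified repro of the actual compiled code: a hand-written autograd.Function standing in for what AOTAutograd generates when y.requires_grad=False at compile time. Its backward hard-codes a zero gradient for y, so calling it later with y.requires_grad=True silently produces zeros instead of an error. The class name is illustrative.

```python
import torch

class CompiledWithFrozenY(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, y):
        ctx.save_for_backward(y)
        return x * y

    @staticmethod
    def backward(ctx, grad_out):
        (y,) = ctx.saved_tensors
        # y's gradient was never compiled, so a zero gradient is hard-coded,
        # even though the correct gradient would be grad_out * x.
        return grad_out * y, torch.zeros_like(y)

x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)  # requires_grad=True at call time
out = CompiledWithFrozenY.apply(x, y)
out.sum().backward()
print(x.grad)  # correct gradient (equals y)
print(y.grad)  # all zeros -- silently wrong, no error raised
```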
We can detect this situation by simply testing whether the inputs' requires_grad flags are consistent with what we compiled with: if we compiled an input with requires_grad=False, we MUST NOT later run with that input having requires_grad=True. Strictly speaking this check is unnecessary, since Dynamo is supposed to guard against this situation, but it is pretty cheap and we should consider running it, if not putting it under debug mode.
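A sketch of the kind of cheap runtime check described above, not the actual AOTAutograd code: record each input's requires_grad flag at compile time, and at call time raise if an input compiled with requires_grad=False now has requires_grad=True. The helper names (record_requires_grad, check_inputs) are illustrative.

```python
import torch

def record_requires_grad(example_inputs):
    # Snapshot the flags that the function was compiled against.
    return [t.requires_grad for t in example_inputs]

def check_inputs(compiled_requires_grad, runtime_inputs):
    for i, (compiled_rg, t) in enumerate(zip(compiled_requires_grad, runtime_inputs)):
        if not compiled_rg and t.requires_grad:
            raise RuntimeError(
                f"Input {i} was compiled with requires_grad=False but is now "
                f"passed with requires_grad=True; the compiled backward would "
                f"silently return a zero gradient for it."
            )

# Usage: f(x, y) was compiled with y.requires_grad=False ...
x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=False)
compiled_flags = record_requires_grad([x, y])

# ... and is later called with a y that requires grad.
y_bad = torch.randn(3, requires_grad=True)
try:
    check_inputs(compiled_flags, [x, y_bad])
except RuntimeError as e:
    print(e)  # loud error instead of a silent zero gradient
```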
Error logs
No response
Minified repro
No response