
Commit 7af38eb

tianyeeT authored and pytorchmergebot committed
Fix unexpected inference_mode interaction with torch.autograd.functional.jacobian (#130307)

Fixes #128264
Pull Request resolved: #130307
Approved by: https://github.com/soulitzer
1 parent dc1959e commit 7af38eb

File tree

1 file changed (+3, -3 lines)

docs/source/notes/autograd.rst

Lines changed: 3 additions & 3 deletions
@@ -240,9 +240,9 @@ This better runtime comes with a drawback: tensors created in inference mode
 will not be able to be used in computations to be recorded by autograd after
 exiting inference mode.

-Enable inference mode when you are performing computations that don't need
-to be recorded in the backward graph, AND you don't plan on using the tensors
-created in inference mode in any computation that is to be recorded by autograd later.
+Enable inference mode when you are performing computations that do not have
+interactions with autograd, AND you don't plan on using the tensors created
+in inference mode in any computation that is to be recorded by autograd later.

 It is recommended that you try out inference mode in the parts of your code
 that do not require autograd tracking (e.g., data processing and model evaluation).
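The drawback the revised note describes can be seen in a short sketch: a tensor created under `torch.inference_mode()` is an "inference tensor" and cannot later participate in a computation that autograd records. The tensor names below are illustrative, not from the patch.

```python
import torch

# Inside inference_mode, autograd records nothing: faster, less memory.
with torch.inference_mode():
    x = torch.ones(3)  # x is an "inference tensor"

# Outside the context, using x in a computation that autograd would
# record raises a RuntimeError -- this is the documented drawback.
w = torch.ones(3, requires_grad=True)
try:
    (w * x).sum().backward()
except RuntimeError as e:
    print("RuntimeError:", e)
```

A common workaround is to `x.clone()` outside the context, which produces a normal tensor that autograd can track.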
