Fix typo in compile docstring regarding default cache_size_limit (#125145)

The docstring of `torch.compile` states that the default `torch._dynamo.config.cache_size_limit` is `64`, while the actual value in the corresponding .py file is `8`.
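For reference, the value can be checked directly at runtime. A minimal sketch (the printed number reflects whatever default the installed torch build ships):

```python
import torch._dynamo.config as dynamo_config

# On builds matching this commit's torch/_dynamo/config.py, this prints 8.
print(dynamo_config.cache_size_limit)
```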

Pull Request resolved: #125145
Approved by: https://github.com/kit1980
Ghelfi authored and pytorchmergebot committed Apr 29, 2024
1 parent 8c21925 commit 26f8d96
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion torch/__init__.py
@@ -1806,7 +1806,7 @@ def compile(model: Optional[Callable] = None, *,
 results are not applicable for subsequent calls (this is called a "guard
 failure"), you can use TORCH_LOGS=guards to debug these situations.
 Multiple compiled results can be associated with a frame up to
-``torch._dynamo.config.cache_size_limit``, which defaults to 64; at which
+``torch._dynamo.config.cache_size_limit``, which defaults to 8; at which
 point we will fall back to eager. Note that compile caches are per
 *code object*, not frame; if you dynamically create multiple copies of a
 function, they will all share the same code cache.
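For context, `cache_size_limit` caps how many compiled variants Dynamo keeps per code object before falling back to eager. Below is a minimal sketch of adjusting the limit and exercising recompiles; the function `double` and the value `16` are illustrative assumptions, not part of this commit:

```python
import torch
import torch._dynamo

# Inspect the default recompile cap discussed in this commit (8 at the time).
print(torch._dynamo.config.cache_size_limit)

# Raise the cap if a function legitimately needs many specializations.
torch._dynamo.config.cache_size_limit = 16  # arbitrary example value

@torch.compile
def double(x):
    return x * 2

# Each new input shape can fail existing guards and trigger a recompile.
# Once the cached compiled results for this code object exceed
# cache_size_limit, new variants fall back to eager execution.
for n in (2, 3, 4):
    double(torch.randn(n))
```

As the docstring notes, running with `TORCH_LOGS=guards` shows which guard failed when a recompile is triggered.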
