BUG: reference count leak when using THPLayout_New and THPMemoryFormat_New (static analyzer reports) #77839
Labels
- module: memory usage (PyTorch is using more memory than it should, or it is leaking memory)
- module: python frontend (For issues relating to PyTorch's Python frontend)
- triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Hint 1: In functions THPLayout_New and THPMemoryFormat_New, the tp_alloc method is used to allocate the memory for the PyObject to be returned. If the default allocator function is inherited, the call returns a new reference.

Hint 2: PyModule_AddObject steals a reference to its third argument only when its return value is zero; on a nonzero (error) return, the caller still owns that reference.

In function initializeLayouts, the trigger path provided by our analyzer is as follows. (Internal Report ID: e2a925)

1. A new reference is returned from THPLayout_New and assigned to strided_layout. (refcnt = 1) (pytorch/torch/csrc/utils/tensor_layouts.cpp, line 25 in d1bb420)
2. The refcnt is increased. (refcnt = 2) (line 26 in d1bb420)
3. An exception is thrown without decreasing the refcnt. (line 28 in d1bb420)
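The leak in this path can be illustrated with a simplified, self-contained simulation. Everything below (FakePyObject, fake_layout_new, fake_module_add_object) is an invented stand-in chosen only to mirror the refcount flow the analyzer reports; it is not PyTorch or CPython code.

```cpp
// Invented stand-in for a CPython object: only the refcount matters here.
struct FakePyObject { int refcnt = 0; };

// Stand-in for THPLayout_New: like an inherited tp_alloc, it returns a
// NEW reference (refcnt = 1) that the caller owns.
FakePyObject* fake_layout_new() {
    auto* obj = new FakePyObject;
    obj->refcnt = 1;
    return obj;
}

// Stand-in for PyModule_AddObject: on success (return 0) it steals one of
// the caller's references (ownership moves to the module; the numeric count
// itself does not change); on failure (nonzero) the reference is NOT stolen.
int fake_module_add_object(FakePyObject* /*obj*/, bool fail) {
    return fail ? -1 : 0;
}

// Mirrors the reported flow in initializeLayouts, simplified:
int leaky_path(FakePyObject*& strided_layout, bool fail) {
    strided_layout = fake_layout_new();          // refcnt = 1
    strided_layout->refcnt += 1;                 // Py_INCREF, refcnt = 2
    if (fake_module_add_object(strided_layout, fail) != 0) {
        // BUG: the error path bails out without a matching decref, so the
        // reference intended for the module is never released.
        return -1;
    }
    return 0;                                    // module owns 1, caller owns 1
}
```

On the failure branch, the function returns with refcnt still at 2 even though the module took nothing, which is exactly the leaked reference the analyzer flags.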
In function add_memory_format in initializeMemoryFormats, the trigger path provided by our analyzer is as follows. (Internal Report ID: 09d8df)

1. A new reference is returned from THPMemoryFormat_New and assigned to memory_format. (refcnt = 1) (pytorch/torch/csrc/utils/tensor_memoryformats.cpp, line 32 in d1bb420)
2. The refcnt is increased. (refcnt = 2) (line 33 in d1bb420)
3. The refcnt is decreased. (refcnt = 1) (line 35 in d1bb420)
4. An exception is thrown without decreasing the refcnt. (line 36 in d1bb420)
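One possible fix pattern, sketched with the same invented simulation (FakePyObject and fake_module_add_object are stand-ins, not real APIs): on the failure branch, drop the reference that PyModule_AddObject did not steal before reporting the error, which is the standard CPython idiom (Py_DECREF on a nonzero return). Whether this matches the eventual PyTorch fix is an assumption.

```cpp
// Invented stand-in for a CPython object: only the refcount matters here.
struct FakePyObject { int refcnt = 0; };

// Stand-in for THPMemoryFormat_New: returns a new reference (refcnt = 1).
FakePyObject* fake_memory_format_new() {
    auto* obj = new FakePyObject;
    obj->refcnt = 1;
    return obj;
}

// Stand-in for PyModule_AddObject: steals a reference only on success;
// the numeric count is unchanged either way, only ownership moves.
int fake_module_add_object(FakePyObject* /*obj*/, bool fail) {
    return fail ? -1 : 0;
}

// Fixed flow: decref on the failure branch before reporting the error.
// The caller keeps its own original reference (these objects are cached
// singletons in the real code), so refcnt = 1 is the correct end state
// on failure and refcnt = 2 (module + cache) on success.
int fixed_path(FakePyObject*& memory_format, bool fail) {
    memory_format = fake_memory_format_new();    // refcnt = 1
    memory_format->refcnt += 1;                  // Py_INCREF for the module, refcnt = 2
    if (fake_module_add_object(memory_format, fail) != 0) {
        memory_format->refcnt -= 1;              // Py_DECREF: the module did not
                                                 // steal it, so release it here
        return -1;                               // refcnt = 1, no leak
    }
    return 0;                                    // refcnt = 2: module + cache
}
```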
cc @ezyang @gchanan @zou3519