torch.memory_format are not singletons #77135

@ezyang

Description

🐛 Describe the bug

Memory format retrieval allocates a fresh memory format object via THPMemoryFormat_New on every call:

tools/autograd/templates/python_nn_functions.cpp:    PyTuple_SET_ITEM(tuple.get(), 3, THPMemoryFormat_New(opt_memory_format.value(), "unused_name"));
torch/csrc/MemoryFormat.cpp:PyObject *THPMemoryFormat_New(at::MemoryFormat memory_format, const std::string& name)
torch/csrc/MemoryFormat.h:PyObject * THPMemoryFormat_New(at::MemoryFormat memory_format, const std::string& name);
torch/csrc/utils/tensor_memoryformats.cpp:    PyObject* memory_format = THPMemoryFormat_New(format, module_name + name); \
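The consequence of allocating a fresh object on each retrieval is that identity comparisons break: two retrievals of the "same" memory format compare unequal under `is`. The sketch below is a hypothetical Python illustration of the difference between fresh allocation and a singleton cache; the class and helper names are made up for this example and are not PyTorch APIs.

```python
# Hypothetical illustration (not PyTorch code): enum-like values that are
# re-created on every retrieval break identity (`is`) comparisons.

class MemoryFormat:
    def __init__(self, name):
        self.name = name

def get_format_fresh(name):
    # Mirrors the reported behavior of THPMemoryFormat_New:
    # a new object is allocated on every call.
    return MemoryFormat(name)

_FORMAT_CACHE = {}

def get_format_singleton(name):
    # Singleton-style fix: return one canonical object per format name.
    if name not in _FORMAT_CACHE:
        _FORMAT_CACHE[name] = MemoryFormat(name)
    return _FORMAT_CACHE[name]

# Fresh allocation: equal names, but distinct objects.
a = get_format_fresh("channels_last")
b = get_format_fresh("channels_last")
print(a is b)  # False

# Singleton cache: identity comparison behaves as users expect.
c = get_format_singleton("channels_last")
d = get_format_singleton("channels_last")
print(c is d)  # True
```

With singletons, code such as `if fmt is torch.channels_last:` works reliably; with fresh allocation it can silently fail.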

Versions

master

cc @VitalyFedyunin @jamesr66a

Metadata

    Labels

    module: memory format — Memory format/layout related issues/changes (channels_last, nhwc)
    triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
