
[discussion] Smarter version of torch.reshape (can avoid realloc in some cases) #28090

@vadimkantorov

Description


Imagine the following situation:

a = torch.rand(2, 3, 4)
a_ = a.transpose(-1, -2)
b_ = some_eltwise_inplace_func(a_.reshape(2, -1)) # reshape seems to reallocate (checked via data_ptr); view would error out
b = b_.view_as(a) # or b_.view_as(a_) (currently view_as copies the dimension order and disregards the strides, though)

Currently a_.reshape(2, -1) seems to reallocate, but a copy is not strictly necessary for all operations, especially elementwise ones (whose result is frequently reshaped back anyway). The same situation comes up in C++ code. I guess the current solution is manual handling of strides.
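A minimal sketch of the behavior described above: transposing makes the tensor non-contiguous, so view() fails and reshape() silently copies, which the data_ptr check exposes.

```python
import torch

a = torch.rand(2, 3, 4)
a_ = a.transpose(-1, -2)  # non-contiguous view: shape (2, 4, 3), strides (12, 1, 3)

# view() refuses: the requested shape is incompatible with these strides.
try:
    a_.view(2, -1)
except RuntimeError:
    print("view() raised RuntimeError")

# reshape() succeeds but silently copies: the data pointer changes.
b = a_.reshape(2, -1)
print(a.data_ptr() == b.data_ptr())  # False -> reshape reallocated
```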

If reshape did not reallocate, the semantics would differ, but it may still be worthwhile to allow flattening of contiguous chunks of memory without reallocation, ignoring the stride-induced dimension order.
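A hedged sketch of the manual-stride workaround: since elementwise in-place ops do not care about element order, one can apply them through a flat view of the contiguous base tensor instead of the transposed view, so no reallocation occurs. Here mul_ is just a stand-in for an arbitrary elementwise in-place function; this is a workaround, not the proposed API.

```python
import torch

a = torch.rand(2, 3, 4)
a_ = a.transpose(-1, -2)

# Operate on a flat view of the *base* tensor, which is contiguous.
flat = a.view(-1)   # no copy: shares storage with a
flat.mul_(2.0)      # stand-in for any elementwise in-place op

# The transposed view observes the update; nothing was reallocated.
print(flat.data_ptr() == a.data_ptr())  # True
```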

Metadata


Assignees

No one assigned

    Labels

function request — A request for a new function or the addition of new arguments/modes to an existing function.
    module: viewing and reshaping
    triaged — This issue has been looked at by a team member, triaged, and prioritized into an appropriate module.

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
