
[feature request] More methods for PackedSequence #8921

@dmitriy-serdyuk

Description

As a person who works a lot with recurrent networks and sequences, I wish it were easier to work with PackedSequence. I frequently find myself packing and unpacking a sequence just to perform some simple operation, such as:

from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence

# unpack to a padded tensor, apply the op, then repack
sequence, lengths = pad_packed_sequence(input, batch_first=True)
sequence = elementwise_function(sequence)
output = pack_padded_sequence(sequence, lengths, batch_first=True)

This complicates the code, introduces bugs (e.g. setting batch_first incorrectly), and forces me to write two versions of the same code: one for tensors and one for packed sequences.
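
For elementwise operations the round trip is avoidable, because a PackedSequence stores all of its values in a flat data tensor; transforming that tensor and rewrapping it gives the same result. A minimal sketch (the helper name apply_to_packed is mine, not part of the API; it relies on PackedSequence being a namedtuple, so _replace is available):

import torch
from torch.nn.utils.rnn import pack_padded_sequence

def apply_to_packed(fn, packed):
    # Elementwise ops never mix values across time steps or batch
    # entries, so transforming the flat .data tensor is safe.
    return packed._replace(data=fn(packed.data))

x = torch.randn(3, 5, 4)           # batch of 3 sequences, max length 5
lengths = torch.tensor([5, 3, 2])  # sorted, as pack_padded_sequence expects
packed = pack_padded_sequence(x, lengths, batch_first=True)
packed = apply_to_packed(torch.tanh, packed)  # no pad/pack round trip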

PackedSequence could be extended to act more like torch.Tensor. More specifically, it could be extended with the following methods (I tried to choose the ones that make sense; implementation sketches follow the lists below):

Element-wise unary operations (applied to the underlying data tensor)

  • abs, acos, asin, atan, atan2, ceil, clamp, contiguous, cos, cosh, exp, expm1, frac, log, log10, log1p, log2, mul, pow, reciprocal, neg, renorm, round, rsqrt, sigmoid, sign, sin, sinh, sqrt, tan, tanh, trunc
  • fill_, pin_memory
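
All of these could be implemented mechanically by delegating to the Tensor method of the same name on .data and rewrapping. A hypothetical monkey-patching sketch, shown only to illustrate the delegation pattern, not as a proposal for how PyTorch should implement it:

import torch
from torch.nn.utils.rnn import PackedSequence

def _make_unary(name):
    def method(self, *args, **kwargs):
        # Apply the Tensor method of the same name to the flat data
        # tensor and rewrap, keeping batch_sizes untouched.
        return self._replace(data=getattr(self.data, name)(*args, **kwargs))
    method.__name__ = name
    return method

for _name in ["abs", "exp", "log", "sigmoid", "tanh", "neg", "clamp", "pow"]:
    setattr(PackedSequence, _name, _make_unary(_name))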

Boolean functions

  • is_contiguous, is_pinned

Element-wise binary operations

Two sequences can be added, subtracted, compared, etc., when they have the same shape and lengths; a sketch with a lengths check follows this list.

  • add, div, eq, fmod, ge, gt, le, lt, ne, remainder, sub
  • map_
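
For binary operations the two sequences must have been packed with the same lengths, which is exactly the condition that their batch_sizes tensors are equal. A sketch of that check (packed_binary_op is a hypothetical helper, not an existing function):

import torch

def packed_binary_op(fn, a, b):
    # Identical batch_sizes means both batches had identical lengths,
    # so the flat data tensors are aligned element for element.
    if not torch.equal(a.batch_sizes, b.batch_sizes):
        raise ValueError("PackedSequences were packed with different lengths")
    return a._replace(data=fn(a.data, b.data))

For example, packed_binary_op(torch.add, a, b) adds two packed sequences elementwise, and packed_binary_op(torch.lt, a, b) compares them.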

Autograd functions

  • detach
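
detach fits the same delegation pattern; a sketch that returns a sequence whose data tensor is cut out of the autograd graph while still sharing storage:

def packed_detach(packed):
    # .detach() shares memory with the original data tensor but stops
    # gradients from flowing back through it.
    return packed._replace(data=packed.data.detach())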

cc @albanD @mruberry @jbschlosser


    Labels

    • feature (A request for a proper, new feature)
    • module: nn (Related to torch.nn)
    • triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
