Labels: feature (A request for a proper, new feature), module: nn (Related to torch.nn), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
As a person who works a lot with recurrent networks and sequences, I wish it were easier to work with PackedSequence. I frequently find myself packing and unpacking sequences just to perform some simple operation, such as:
```python
from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence

sequence, lengths = pad_packed_sequence(input, batch_first=True)
sequence = elementwise_function(sequence)
sequence = pack_padded_sequence(sequence, lengths, batch_first=True)
```
This boilerplate complicates the code, introduces bugs (e.g. setting batch_first incorrectly), and forces me to write two versions of the same code: one for tensors and one for packed sequences.
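For a purely element-wise operation there is a cheaper workaround than the pad/pack round trip above: apply the function directly to the flat .data tensor and rebuild the PackedSequence around the result. A minimal sketch, assuming a PyTorch version whose PackedSequence carries sorted_indices/unsorted_indices (the helper name apply_elementwise is mine):

```python
import torch
from torch.nn.utils.rnn import PackedSequence, pack_sequence

def apply_elementwise(fn, packed):
    # fn operates on the flat (total_steps, *) data tensor; the batch
    # layout (batch_sizes and the optional sort indices) is reused
    # unchanged, which is only valid for an element-wise fn.
    return PackedSequence(fn(packed.data), packed.batch_sizes,
                          packed.sorted_indices, packed.unsorted_indices)

packed = pack_sequence([torch.randn(3, 5), torch.randn(2, 5)])
packed = apply_elementwise(torch.sigmoid, packed)
```

This avoids materializing the padded tensor, but it still has to be written by hand for every such operation, which is exactly what the proposal below would remove.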
PackedSequence could be extended to act more like torch.Tensor. More specifically, it could be extended with the following methods (I tried to choose the ones that make sense); a sketch of how they could delegate to the underlying .data tensor follows the list:
Element-wise unary operations (with respect to the underlying data tensor)
- abs, acos, asin, atan, atan2, ceil, clamp, contiguous, cos, cosh, exp, expm1, frac, log, log10, log1p, log2, mul, pow, reciprocal, neg, renorm, round, rsqrt, sigmoid, sign, sin, sinh, sqrt, tan, tanh, trunc
- fill_, pin_memory
Boolean functions
- is_contiguous, is_pinned
Element-wise binary operations
Two sequences can be added, subtracted, compared, etc., when they have the same shape and lengths.
- add, div, eq, fmod, ge, gt, le, lt, ne, remainder, sub
- map_
Autograd functions
- detach
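To make the shape of the proposal concrete, here is a rough sketch (illustrative only, not a proposed implementation) that generates a handful of the unary methods by delegating to .data, plus an element-wise add that checks both sequences share the same layout. _rebuild, _unary, and _add are hypothetical helper names, and the same recent-PyTorch PackedSequence fields are assumed:

```python
import torch
from torch.nn.utils.rnn import PackedSequence

def _rebuild(packed, data):
    # Reuse the existing batch layout; valid only when the result is
    # element-wise, so lengths and ordering are unchanged.
    return PackedSequence(data, packed.batch_sizes,
                          packed.sorted_indices, packed.unsorted_indices)

def _unary(name):
    def method(self, *args, **kwargs):
        return _rebuild(self, getattr(self.data, name)(*args, **kwargs))
    return method

# An illustrative subset of the unary list above.
for _name in ('abs', 'exp', 'log', 'neg', 'sigmoid', 'tanh'):
    setattr(PackedSequence, _name, _unary(_name))

def _add(self, other):
    # Element-wise binary op: both sequences must have identical
    # lengths, i.e. identical batch_sizes (and thus the same .data shape).
    assert torch.equal(self.batch_sizes, other.batch_sizes)
    return _rebuild(self, self.data + other.data)

PackedSequence.add = _add
```

With something along these lines in place, the snippet at the top collapses to a single call such as packed = packed.sigmoid(), with no pad/pack round trip and no batch_first argument to get wrong.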