
Slicing tensors with negative steps #604

Closed

MaximumEntropy opened this issue Jan 27, 2017 · 6 comments

@MaximumEntropy (Contributor) commented Jan 27, 2017

It'd be nice to be able to slice tensors using a negative step, such as tensor[::-1] or tensor[:, ::-1, :], as is possible with NumPy or Theano. It would make implementing things like bidirectional RNNs (without using the built-in RNN modules) simpler.
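For reference, the requested semantics match NumPy's negative-step slicing; a minimal sketch using NumPy for illustration (PyTorch itself does not support this, which is the point of the issue):

```python
import numpy as np

x = np.arange(12).reshape(3, 4)

# Reverse along the first axis; in NumPy this produces a view, not a copy.
rev_rows = x[::-1]

# Reverse along the middle axis of a 3-D array, e.g. reversing the time
# axis of a (batch, time, features) tensor for a backward RNN pass.
seq = np.arange(24).reshape(2, 3, 4)
rev_time = seq[:, ::-1, :]

print(rev_rows[0].tolist())  # the last row of x comes first: [8, 9, 10, 11]
```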

@fmassa (Member) commented Jan 27, 2017

Allowing negative strides is already being tracked in #229

@apaszke (Member) commented Jan 27, 2017

I agree, but our C backends will require a few, possibly nontrivial, changes. I'm closing this since it's a duplicate.

@apaszke closed this Jan 27, 2017
@MaximumEntropy (Contributor, Author) commented Jan 28, 2017

Oops, apparently I didn't look carefully enough. Apologies.

@el3ment commented Jul 28, 2018

@apaszke I think this issue might deserve to be reopened. torch.flip isn't quite the same thing as negative striding or negative stepping -- torch.flip produces an (efficient) copy, but negative striding wouldn't need to perform a copy at all.
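To illustrate the distinction, here is a sketch using NumPy, where a negative step returns a stride-based view that shares memory with the original array rather than copying it:

```python
import numpy as np

a = np.arange(5)

# Negative-step slicing returns a view: same buffer, negated stride.
view = a[::-1]
assert np.shares_memory(a, view)
assert view.strides[0] == -a.strides[0]

# Because no copy is made, mutating the original is visible in the view.
a[0] = 100
print(view[-1])  # prints 100
```

torch.flip, by contrast, materializes a new tensor, which is why it cannot substitute for negative striding when the goal is a zero-copy reversed view.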

@soumith (Member) commented Aug 14, 2018

@el3ment while what you say is absolutely correct, I don't think we are planning to support negative striding anytime in the near or far future (if ever). It would require rethinking our internals and is a huge undertaking for what we see as a small perceived benefit. cc @ezyang to double-check that our new C10 Tensor design is not incorporating negative strides.

@ezyang (Contributor) commented Aug 14, 2018

We were originally thinking that when we looked into changing TH's size/stride semantics, it might be "easy" to also make things work with negative strides, but we ended up spending our entire engineering budget just on making zero-size dims and scalars work correctly.
