index_put_ take min when there are repeated indices #19197

Open
TaoHuUMD opened this issue Apr 12, 2019 · 2 comments

Comments

@TaoHuUMD

TaoHuUMD commented Apr 12, 2019

🚀 Feature

Assume we want to do the assignment image[x, y] = p, where x, y, p are lists. For example,

x=[1, 1, 2]

y=[3, 3, 4]

p=[1, 2, 3]

Note that an element of image may be written more than once: here x[0] = x[1] = 1 and y[0] = y[1] = 3, but p[0] = 1 and p[1] = 2. In this case image[1][3] should end up as the minimum, 1.

But it seems that torch.where does not handle the case where one element can be updated two or more times.

Is there an efficient way to do this, preferably with a PyTorch op? I can update image in a for loop, but that is too slow, since len(x) is large.
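
A minimal sketch of the per-element loop mentioned above, just to make the intended semantics concrete (image is assumed here to be a small 2-D tensor; this is an illustration, not efficient code):

import torch

image = torch.zeros(5, 5)
x = [1, 1, 2]
y = [3, 3, 4]
p = [1., 2., 3.]

# Python-level loop: keeps the minimum for repeated (x, y) pairs,
# but far too slow when len(x) is large.
seen = set()
for xi, yi, pi in zip(x, y, p):
    if (xi, yi) in seen:
        image[xi, yi] = min(image[xi, yi].item(), pi)
    else:
        image[xi, yi] = pi
        seen.add((xi, yi))
# image[1, 3] is now 1 (the minimum of the two writes), image[2, 4] is 3.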


@ezyang ezyang changed the title In torch.where, whether this case is considered. index_put_ handling of repeated indices Apr 14, 2019
@ezyang ezyang changed the title index_put_ handling of repeated indices index_put_ take min when there are repeated indices Apr 14, 2019
@ezyang
Contributor

ezyang commented Apr 14, 2019

First a quick clarification: you're talking about advanced indexing a.k.a. index_put_, not where_. The documentation of index_put_ says what the behavior is supposed to be in this case:

If accumulate is True, the elements in tensor are added to self. If accumulate is False, the behavior is undefined if indices contain duplicate elements.

So unfortunately, if you want to take the min, you are currently out of luck. I can't think of a good way of doing this with our current API, so I'll leave this open as a feature request (I don't know how efficiently implementable this would be; we would need an "atomicMin" operation). If you ask on https://discuss.pytorch.org/ others may have suggestions about how to efficiently implement this API in userland. Also, it would be helpful to say why you want the min in this case.
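
For reference, a minimal sketch of that documented accumulate=True behavior (values at repeated indices are summed, so it does not give a minimum):

import torch

image = torch.zeros(5, 5)
x = torch.tensor([1, 1, 2])
y = torch.tensor([3, 3, 4])
p = torch.tensor([1., 2., 3.])

# accumulate=True adds duplicate writes together:
# image[1, 3] becomes 1 + 2 = 3, not the desired minimum of 1.
image.index_put_((x, y), p, accumulate=True)
print(image[1, 3])  # tensor(3.)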

@ezyang ezyang added enhancement Not as big of a feature, but technically not a bug. Should be easy to fix module: advanced indexing Related to x[i] = y, index functions low priority We're unlikely to get around to doing this in the near future triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module labels Apr 14, 2019
@Chen-Suyi

It may be too late for you now, but I did find a solution with the current PyTorch API, following the clue of the "atomicMin" operation. The function you need should be torch.Tensor.scatter_reduce_.

The following example may help you:

import torch

# image is assumed to be an existing 2-D tensor
x = torch.tensor([1, 1, 2])
y = torch.tensor([3, 3, 4])
p = torch.tensor([1, 2, 3], dtype=image.dtype)

shape = image.shape
# flatten the 2-D coordinates into 1-D indices
index = x * shape[1] + y
image = image.flatten()
# reduce="amin" keeps the minimum among repeated indices;
# include_self=False ignores the values already stored in image at those positions
image.scatter_reduce_(dim=0, index=index, src=p, reduce="amin", include_self=False)
image = image.reshape(shape)
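
With the toy values above, image[1, 3] ends up as 1 (the minimum of the two duplicate writes) and image[2, 4] as 3; positions that are not indexed keep their original values. Note that scatter_reduce_ (with the trailing underscore) updates image in place, while the out-of-place scatter_reduce would need its result assigned back.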

