
Add sparse COO tensor support to torch.sum(dim=..., keepdim=...) #92979

Closed
pearu wants to merge 4 commits

Conversation

pearu (Collaborator) commented Jan 25, 2023
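
For context, a minimal sketch of the usage this PR enables, assuming a PyTorch build that includes the change (tensor values are illustrative):

>>> import torch
>>> i = torch.tensor([[0, 1, 1], [2, 0, 2]])
>>> v = torch.tensor([3.0, 4.0, 5.0])
>>> s = torch.sparse_coo_tensor(i, v, size=(2, 3))
>>> # Previously, torch.sum on a sparse COO input did not support the
>>> # dim=.../keepdim=... combination; with this change, the reduced
>>> # dimension is kept as size 1, matching the dense behavior.
>>> torch.sum(s, dim=1, keepdim=True).shape
torch.Size([2, 1])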

pytorch-bot bot commented Jan 25, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/92979

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 44bc03a:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

pearu added a commit that referenced this pull request Jan 25, 2023
ghstack-source-id: 25ae3601cd12d14ab08dc39685f5b8eaad6ec516
Pull Request resolved: #92979
@pearu pearu self-assigned this Jan 25, 2023
@pearu pearu added this to In progress in Sparse tensors via automation Jan 25, 2023
Update on "Add sparse COO tensor support to torch.sum(dim=..., keepdim=...)"

Fixes #92757, #86232

cc alexsamardzic nikitaved cpuhrsch amjames bhosmer

[ghstack-poisoned]
pearu added a commit that referenced this pull request Jan 25, 2023
ghstack-source-id: c8213fa000e840b6eef4af218d8bd912aaa39f0b
Pull Request resolved: #92979
pearu added a commit that referenced this pull request Jan 25, 2023
ghstack-source-id: 62812dbb00bde7e051becf5fd15a61e8eaad4bf5
Pull Request resolved: #92979
@pearu pearu requested a review from cpuhrsch January 25, 2023 17:35
Sparse tensors automation moved this from In progress to Reviewer approved Jan 25, 2023
pearu added a commit that referenced this pull request Jan 26, 2023
ghstack-source-id: ff95011c857f9e67f1e39849cc21a5edd4be3120
Pull Request resolved: #92979
pearu (Collaborator, Author) commented Jan 26, 2023

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jan 26, 2023
pytorchmergebot (Collaborator) commented:
Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

  }
  if (keepdim) {
    auto dim_mask = make_dim_mask(dim, self.dim());
    for (int dim = 0; dim < self.dim(); dim++) {
Contributor commented:
Should these unsqueeze calls start from the last dimension instead of the first one?

pearu (Collaborator, Author) replied Jan 26, 2023:
No, because dim_mask and the dim loop correspond to the shape of self, and unsqueeze inserts (read: restores) the removed dimensions at positions expressed relative to the shape of self, not the shape of result.

Consider the following example:

>>> a = torch.tensor([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
>>> mask_dim = [True, True, False]
>>> dim = tuple(i for i in range(len(mask_dim)) if mask_dim[i])
>>> dim
(0, 1)
>>> # Using algorithm in this PR:
>>> result = a.sum(dim=dim, keepdim=False)
>>> for i in range(a.ndim):
...     result = result.unsqueeze(i) if mask_dim[i] else result
... 
>>> result
tensor([[[16, 20]]])
>>> # Using unsqueeze starting from the last dimension:
>>> result = a.sum(dim=dim, keepdim=False)
>>> for i in reversed(range(a.ndim)):
...     result = result.unsqueeze(i) if mask_dim[i] else result
... 
>>> result
tensor([[[16],
         [20]]])
>>> # The expected result:
>>> a.sum(dim=dim, keepdim=True)
tensor([[[16, 20]]])

As you can see, the algorithm implemented in this PR gives the expected result, while reversing the order of the unsqueeze calls does not.
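
To make the thread self-contained, here is a minimal Python sketch of the keepdim restoration that the C++ excerpt above performs (the helper name is hypothetical; the loop mirrors pearu's example):

import torch

def sum_keepdim_via_unsqueeze(a, dim):
    # Hypothetical helper mirroring the loop above: reduce with
    # keepdim=False, then restore each reduced dimension in increasing
    # order, so every position is interpreted relative to a's shape.
    dim_mask = [i in dim for i in range(a.ndim)]
    result = a.sum(dim=dim, keepdim=False)
    for i in range(a.ndim):
        if dim_mask[i]:
            result = result.unsqueeze(i)
    return result

a = torch.tensor([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
assert torch.equal(sum_keepdim_via_unsqueeze(a, (0, 1)),
                   a.sum(dim=(0, 1), keepdim=True))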

Sparse tensors automation moved this from Reviewer approved to Done Jan 26, 2023
@facebook-github-bot facebook-github-bot deleted the gh/pearu/88/head branch June 8, 2023 18:18
Labels: ciflow/trunk (Trigger trunk jobs on your pull request), Merged, module: reductions, module: sparse (Related to torch.sparse), open source, release notes: sparse

Projects: Sparse tensors

Development: Successfully merging this pull request may close these issues. None yet.

5 participants