Conversation

@pearu
Collaborator

@pearu pearu commented Sep 27, 2020

Fixes #44635.

@pearu pearu added the module: sparse (Related to torch.sparse), module: docs (Related to our documentation, both in docs/ and docblocks), and open source labels Sep 27, 2020
@pearu pearu self-assigned this Sep 27, 2020
@pearu pearu changed the title from Revise sparse tensor documentation. to Revised sparse tensor documentation. Sep 28, 2020
Collaborator

@hameerabbasi hameerabbasi left a comment

A first review pass.

@willwray willwray left a comment

Only spotted a single sentence to fix; suggested change in comment.

@pearu pearu marked this pull request as ready for review September 28, 2020 18:53
@zhangguanheng66 zhangguanheng66 added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Sep 29, 2020
Collaborator

@albanD albanD left a comment

That looks very good.
Thanks for taking the time to do this.
I added small comments inline.

@pearu pearu requested a review from albanD September 30, 2020 19:44
@mruberry mruberry self-requested a review September 30, 2020 21:37
@pearu
Collaborator Author

pearu commented Oct 2, 2020

@vadimkantorov, could you also review this PR, which updates the torch.sparse documentation page?

@pytorch pytorch deleted a comment from codecov bot Oct 2, 2020
@pearu pearu force-pushed the pearu/44635 branch 2 times, most recently from a30dee7 to 4e46790 October 2, 2020 18:30
@pearu
Collaborator Author

pearu commented Oct 4, 2020

This PR is ready for review. Latest updates include:

^ @albanD @ngimel @mruberry @ezyang @rgommers @hameerabbasi

@ezyang
Contributor

ezyang commented Oct 7, 2020

Therefore, I would suggest not implementing any API changes within this PR but discussing them in a separate issue/PR/RFC. What do you think?

I didn't get the sense from Mike's comments that he wanted substantive code changes in this PR, but let me just reiterate: there should be no code changes here, the docs here are As Is. That being said, if there are some weird parts of the API that we'd plan on just fixing later, we shouldn't be as nitpicky about the docs for those parts in this PR. (Not sure how much is covered by that).

@mruberry
Collaborator

mruberry commented Oct 7, 2020

  • English and various notation fixes - this I can tackle according to your review immediately

Sounds good. I hope they're helpful.

  • overall structure of the sparse docs - it will be an iterative process that needs to be continued

Sure.

  • Sparse tensor API - how much API backward compatibility are we required to keep? I think some API changes should be discussed by creating corresponding RFCs rather than implemented in this PR. For instance, the to_dense and to_sparse methods could be merged into one method, to or to_layout, because of the emergence of new sparse format implementations; alternatively, another approach could introduce methods like to_strided, to_sparse_coo, to_sparse_gcs, etc. This PR is about updating the sparse documentation according to the current state (because this is what current users are faced with) and, when possible, taking possible future developments into account. Therefore, I would suggest not implementing any API changes within this PR but discussing them in a separate issue/PR/RFC. What do you think?

I understand this PR is just updating the documentation and I agree with @ezyang that it shouldn't include code changes. The API-related questions are to help me understand the language, goals, and motivations of our sparse story so we can be consistent with our documentation. I would actually suggest this PR hew closer to the current state of the code, as mentioned previously, and maybe not try to provide an organizational structure for a new sparse format or mention the concept of a fill value as anything but zero. Following @ezyang's point, where the docs or current functions do exhibit wonky behavior (like sometimes inferring a non-zero fill value) maybe we should just address those discrepancies as straightforwardly as possible. For example, for softmax we could just add a line saying the function conceptually treats unspecified values as negative infinity, and not as zeros.
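The suggestion about softmax above can be illustrated with a plain-Python sketch (not PyTorch code): treating an unspecified entry as negative infinity means it receives exactly zero probability, since exp(-inf) == 0.0, so the specified values are softmaxed among themselves; a zero fill would instead let the unspecified slot absorb probability mass.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a plain list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

specified = [1.0, 2.0]  # the row's specified values

# Fill with -inf: the unspecified slot gets probability exactly 0.0.
as_neg_inf = softmax(specified + [float("-inf")])

# Fill with 0: the unspecified slot wrongly absorbs probability mass.
as_zero = softmax(specified + [0.0])

print(as_neg_inf)  # third entry is 0.0
print(as_zero)     # third entry is nonzero
```

This is only a conceptual model of the semantics being discussed, not the sparse softmax implementation itself.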

@ezyang
Contributor

ezyang commented Oct 12, 2020

Hey, can we agree on who is shepherding this PR? @albanD, @mruberry and I have all done substantial comments; who is going to be in charge of making sure this actually lands?

@pearu
Collaborator Author

pearu commented Oct 12, 2020

FYI, I am in the process of rewriting sparse.rst based on @mruberry's feedback.

@mruberry
Collaborator

Hey, can we agree on who is shepherding this PR? @albanD, @mruberry and I have all done substantial comments; who is going to be in charge of making sure this actually lands?

I got it.

@pearu pearu marked this pull request as draft October 13, 2020 13:37
@pearu pearu marked this pull request as ready for review October 14, 2020 11:33
@hameerabbasi
Collaborator

Please feel free to re-request review once you're done working on this.

facebook-github-bot pushed a commit that referenced this pull request Oct 15, 2020
Summary:
Fixes #45113

Description:
- Fixed bug in sspaddmm by calling contiguous on indices.
- Added tests

We have to make the indices contiguous because we use `indices.data_ptr` in `_to_csr`, which assumes row-contiguous storage:
https://github.com/pytorch/pytorch/blob/be45c3401af8186f97f0e2b269ff3bafaf16157f/aten/src/ATen/native/sparse/SparseTensorMath.cpp#L1087-L1090
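For context, why contiguity matters here can be sketched in plain Python (an illustrative reimplementation, not the actual ATen `_to_csr` code): the conversion walks the row-index buffer element by element to build CSR row pointers, which is only valid when those indices are laid out contiguously in memory — hence the fix of calling contiguous on them first.

```python
# Illustrative COO row-indices -> CSR row-pointer ("crow") conversion.
def coo_rows_to_csr(row_indices, n_rows):
    crow = [0] * (n_rows + 1)
    for r in row_indices:        # count nonzeros per row
        crow[r + 1] += 1
    for i in range(n_rows):      # prefix-sum into row pointers
        crow[i + 1] += crow[i]
    return crow

# A 4x4 matrix with nonzeros in rows 0, 0, 2, 3:
print(coo_rows_to_csr([0, 0, 2, 3], 4))  # -> [0, 2, 2, 3, 4]
```

In the C++ code the buffer is read through a raw pointer, so a non-contiguous index tensor would yield garbage counts; in this Python sketch the equivalent failure mode would be iterating a strided view as if it were flat.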

> Part 1 of fixing this is probably to document sspaddmm. Part 2 may be to rewrite it using other ops. (#45113 (comment))

- Docs will be written here: #45400

Pull Request resolved: #45963

Reviewed By: malfet

Differential Revision: D24335599

Pulled By: ngimel

fbshipit-source-id: 8278c73a1b4cccc5e22c6f3818dd222588c46b45
@mruberry mruberry self-requested a review October 16, 2020 09:36
Collaborator

@mruberry mruberry left a comment

This is a great, significant improvement to our sparse documentation. Thank you for taking the time to develop these docs, @pearu! Writing is hard and time-consuming.

There are still a few grammatical issues and rough edges, but I think those are best addressed going forward.

Contributor

@facebook-github-bot facebook-github-bot left a comment

@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@mruberry merged this pull request in 905ed3c.

@pearu pearu deleted the pearu/44635 branch October 22, 2020 15:03
Development

Successfully merging this pull request may close these issues.

rewrite the torch.sparse main doc page