
[WIP] SparseVector #11424

Closed
wants to merge 4 commits into from

Conversation

lindahua
Contributor

This is based on the discussion in #11324. This PR remains a work in progress; it does the following:

  • Add the SparseVector type.
  • Add methods of the SparseVector type.
  • Add methods that interact with both SparseMatrix and SparseVector.
  • Add relevant linear algebraic methods.
  • Add a complete test suite.
  • Add documentation.
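
As a rough illustration of the data layout under discussion, here is a hypothetical sketch (in modern Julia syntax, not the PR's actual code) of a compressed sparse vector in the spirit of `SparseMatrixCSC`: only the stored values are kept, together with their sorted indices.

```julia
# Hypothetical sketch of a compressed sparse vector type. Names (MySparseVector,
# nzind, nzval) are illustrative, not the PR's definitions.
struct MySparseVector{Tv,Ti<:Integer} <: AbstractVector{Tv}
    n::Int              # logical length of the vector
    nzind::Vector{Ti}   # sorted indices of the stored (structurally nonzero) entries
    nzval::Vector{Tv}   # the stored values, aligned with nzind
end

Base.size(x::MySparseVector) = (x.n,)

function Base.getindex(x::MySparseVector{Tv}, i::Integer) where Tv
    # Binary search over the sorted index list: O(log nnz) per lookup.
    k = searchsortedfirst(x.nzind, i)
    (k <= length(x.nzind) && x.nzind[k] == i) ? x.nzval[k] : zero(Tv)
end
```

With this layout, `MySparseVector(5, [2, 4], [1.0, 3.0])` represents `[0, 1, 0, 3, 0]` while storing only two values.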

@tkelman tkelman added the domain:arrays:sparse Sparse arrays label May 24, 2015
@ViralBShah
Member

We will also need methods for AbstractMatrix and SparseVector. It would be nice to have some ways to iterate over both sparse matrices and vectors with the same API.
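
A uniform iteration API of the kind suggested here could look like the following sketch, written against today's SparseArrays stdlib. The name `eachstored` is hypothetical, not part of any released API.

```julia
using SparseArrays

# eachstored: iterate the stored (index, value) pairs with one API, whether
# the argument is a sparse vector or a single column of a CSC matrix.
eachstored(x::SparseVector) = zip(findnz(x)...)

function eachstored(A::SparseMatrixCSC, j::Integer)
    r = nzrange(A, j)                     # storage positions of column j
    zip(rowvals(A)[r], nonzeros(A)[r])
end
```

For example, `sum(v for (i, v) in eachstored(A, 2))` sums the stored entries of column 2 without ever touching the zeros.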

@mlubin
Member

mlubin commented May 24, 2015

As @StefanKarpinski suggested, let's make sure we have complete code coverage for this.

@lindahua
Contributor Author

Depending on the decision in #11408

@lindahua
Contributor Author

The last commit of this PR passed tests (as shown by Travis CI). The AppVeyor issue is unrelated. Of course, more tests are still to be added.

@tkelman
Contributor

tkelman commented May 26, 2015

The AppVeyor failure is not unrelated. I can reproduce it by running make testall1 locally on this branch, even on 64-bit Linux. It's most likely some dumb variable name conflict.

@lindahua
Contributor Author

lindahua commented Jun 3, 2015

Note: I am now working towards the NIPS 2015 submission deadline on June 6. Will resume the efforts here afterwards.

@tkelman
Contributor

tkelman commented Jun 3, 2015

Thanks for the update Dahua. Good luck with your submission.

@tkelman
Contributor

tkelman commented Jun 5, 2015

#11583 would be good to address here, once you get back to this.

@ViralBShah ViralBShah added this to the 0.5 milestone Jun 5, 2015
@ViralBShah ViralBShah mentioned this pull request Jun 12, 2015
@ViralBShah
Member

@lindahua Do you think we can do this in the next couple of weeks, which would make it possible to include it in 0.4?

@lindahua
Contributor Author

I don't think I will have much time for coding Julia in the coming month, as I have to kick-start some new research projects. However, this is not a very difficult task, and I hope people interested in this PR can continue the effort ...

@tkelman tkelman self-assigned this Jun 23, 2015
@lindahua
Contributor Author

I am now resuming the efforts on this.

@tkelman tkelman removed their assignment Jul 27, 2015
@tkelman
Contributor

tkelman commented Jul 27, 2015

I have a cleaned-up branch of this at https://github.com/JuliaLang/julia/tree/tk/sparsevec that at least passes tests; you should start from it. It doesn't address the unchecked items in the list up top, though.

@lindahua
Contributor Author

Thanks, I will look into that.

@lindahua
Contributor Author

This PR is going to change the semantics of several functions (e.g. sparsevec will return an instance of SparseVector instead of SparseMatrixCSC) and may break some client code. Given that we are now very late in the 0.4 cycle, I think it is better to make this happen post-0.4.

Also, many more functions are being actively implemented, tested, and tuned in the SparseVectors package.

I am closing this, and will make another PR when:

  • Julia 0.4 is released; and
  • the SparseVectors package implements all the functions that need to be transferred to Base.

@lindahua lindahua closed this Jul 28, 2015
@tkelman
Contributor

tkelman commented Jul 28, 2015

How do you plan on making the SparseVectors package correctly implement things that are currently buggy in base? Non-exported replacements like SparseVectors.sparsevec?

@lindahua
Contributor Author

I am now adding sparsevector to the package; it plays a role similar to sparsevec but returns a SparseVector instance. The function will be renamed to sparsevec when transferred to Base.
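
For reference, this is how the rename eventually played out in released Julia (0.5 and later), where `sparsevec` itself returns a `SparseVector`:

```julia
using SparseArrays

# In Julia 0.4 and earlier, sparsevec(I, V, n) returned an n×1 SparseMatrixCSC;
# since the SparseVector type landed in Base (0.5+), it returns a true
# one-dimensional sparse vector.
x = sparsevec([1, 4], [2.0, 3.0], 6)
x isa SparseVector   # true
ndims(x)             # 1, not 2
```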

@tkelman
Contributor

tkelman commented Jul 28, 2015

Sounds reasonable. Will the package try to change base's indexing rules, returning a SparseVector from column slicing of a SparseMatrixCSC? I think that's the biggest, most important change here, and it would be worth the possible breakage no matter when we decide to do it; but we may as well do this right instead of rushing it.

@lindahua
Contributor Author

Sure, I will explore the slicing as the next major step in the package's development.

Currently, we have view(A, :, i), which returns a SparseVectorView instance when A is a sparse matrix and i is an integer. But we will of course do more later.
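
(For context, in released Julia the slicing change discussed here did land: column indexing of a `SparseMatrixCSC` returns a `SparseVector`, while `view` gives a lazy `SubArray` in Base rather than the package's `SparseVectorView`.)

```julia
using SparseArrays

A = sparse([1, 3], [2, 2], [1.0, 2.0], 4, 4)

col = A[:, 2]     # a SparseVector in Julia 0.5+: the change proposed above
v = view(A, :, 2) # a lazy view; in Base this is a SubArray, not SparseVectorView
```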

@mbauman
Member

mbauman commented Sep 23, 2015

Hearty bump!

Any chance this could be revisited in the near future? Missing a real vector makes indexing into sparse matrices wonky. And it only gets stranger if we drop scalar dimensions — because then S[1,:] becomes… a 1-column matrix?

@lindahua
Contributor Author

lindahua commented Oct 1, 2015

I should probably say that I won't be able to find enough time to push this PR forward before mid-November.

The SparseVectors package has been quite stable. Anyone willing to push this forward should feel free to make a PR based on all, part, or none of that package.

@mbauman mbauman mentioned this pull request Oct 3, 2015
@tkelman tkelman deleted the dh/spvec branch October 4, 2015 07:34