Sparse matrix factorizations (QR, LU, SVD, etc.) #159
Comments
Commented by jcline3 on 23 Jun 41873728 16:09 UTC
Is there anything aside from QR, LU, and SVD that we need? A few of the libraries I have found offer some additional, possibly useful algorithms. If these three are all we care about, it may be reasonable to implement them ourselves. It's only a few hundred lines of code for all three in the dense case, so it's probably not much more for sparse. (Admittedly, there are some issues we would need to pay attention to: numerical stability, overflow and underflow, and so on.)
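As a rough illustration of the "few hundred lines" dense case and the stability concern mentioned above, here is a minimal Householder QR sketch in Python (not mlpack code, purely illustrative; the sign choice for the reflector is the standard guard against cancellation):

```python
import numpy as np

def householder_qr(A):
    """Plain Householder QR: returns Q (m x m) and R (m x n) with A = Q @ R."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m, n)):
        x = R[k:, k]
        # Pick alpha with the opposite sign of x[0] to avoid cancellation
        # when forming v = x - alpha * e1 (the numerical-stability issue).
        alpha = -np.copysign(np.linalg.norm(x), x[0])
        v = x.copy()
        v[0] -= alpha
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:
            continue  # column already zeroed below the diagonal
        v /= norm_v
        # Apply the Householder reflector H = I - 2 v v^T to the trailing block.
        H = np.eye(m)
        H[k:, k:] -= 2.0 * np.outer(v, v)
        R = H @ R
        Q = Q @ H
    return Q, R
```

Since each reflector is orthogonal and symmetric, the accumulated Q is orthogonal and Q @ R reproduces A; a production version would apply the reflectors in place instead of building full H matrices.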
Commented by rcurtin on 25 Jan 41874643 09:53 UTC
Commented by jcline3 on 8 Nov 41877652 17:22 UTC
Commented by rcurtin on 2 Oct 41877844 13:37 UTC
Commented by rcurtin on 17 Nov 41877847 00:05 UTC
Commented by rcurtin on 13 Aug 41955774 04:32 UTC
There's no reason to leave this issue open, and it isn't really within the scope of mlpack.
Reported by rcurtin on 5 Jan 41873650 10:03 UTC
We should be able to factorize sparse matrices the same way Armadillo already does it for dense matrices. However, we should depend on external solvers rather than write our own. According to James, GMM++ does not provide solvers for the QR, LU, or SVD factorization, so we have to look elsewhere. I suppose, then, that the first part of this task is to find external solvers that we can use.
Migrated-From: http://trac.research.cc.gatech.edu/fastlab/ticket/160
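For a sense of what an external sparse solver provides, here is a short sketch using SciPy for illustration (not mlpack or Armadillo code): `splu` wraps a sparse LU factorization (SuperLU) and `svds` computes a truncated sparse SVD.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import splu, svds

# A small sparse tridiagonal system A x = b (5x5, CSC format for SuperLU).
A = diags([[-1.0] * 4, [4.0] * 5, [-1.0] * 4],
          offsets=[-1, 0, 1], format="csc")
b = np.arange(1.0, 6.0)

# Sparse LU factorization; the factor object can be reused for many solves.
lu = splu(A)
x = lu.solve(b)

# Truncated sparse SVD: computes only the k largest singular triplets.
u, s, vt = svds(A, k=2)
```

The point of depending on a library like this is that the pivoting, fill-in reduction, and iterative eigensolver details are handled by well-tested code rather than reimplemented.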