Right now SciJS has some basic support and proof-of-concept tools for working with sparse matrices, but they are not well optimized yet.
In practice, there are really 3 widely used sparse formats:
List of lists
Hash tables
Compressed sparse row/column
The first two are generally used only for constructing sparse matrices, so they don't have to be very fast. The last one, though, requires more serious consideration.
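To make the split between construction formats and compressed formats concrete, here is a small sketch, not SciJS API and with all names made up, of building a matrix in a hash table and then freezing it into CSR arrays:

```javascript
// Hypothetical sketch: accumulate entries in a hash table keyed by
// "row,col" (cheap to update, slow to iterate), then pack into CSR.
const entries = new Map();
function set(i, j, v) { entries.set(i + ',' + j, v); }

set(0, 0, 1);
set(0, 2, 2);
set(2, 1, 3);

// Freeze into CSR: rowPtr[i]..rowPtr[i+1] delimits the nonzeros of row i.
function toCSR(entries, numRows) {
  const triples = [...entries].map(([k, v]) => {
    const [i, j] = k.split(',').map(Number);
    return [i, j, v];
  }).sort((a, b) => a[0] - b[0] || a[1] - b[1]);

  const rowPtr = new Array(numRows + 1).fill(0);
  const colIdx = [], values = [];
  for (const [i, j, v] of triples) {
    rowPtr[i + 1]++;          // count nonzeros per row
    colIdx.push(j);
    values.push(v);
  }
  for (let i = 0; i < numRows; i++) rowPtr[i + 1] += rowPtr[i]; // prefix sum
  return { rowPtr, colIdx, values };
}
```

The construction side can afford the hash-table overhead because it runs once; the CSR arrays are what the hot loops actually touch.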
One interesting observation is that compressed sparse row and compressed sparse column matrices are effectively transposes of one another. In fact, I believe we can realize the same benefits as the CSR(C) formats in ndarray by just using run-length encoding, which is equivalent. It is also nice because all the usual ndarray transpose/slice/step operations would work without any changes. Using run-length encoding as the default sparse format would give a general and efficient solution, which could be useful in iterative solvers or other applications that need high-performance tensor operations. Finally, RLE could also be useful for working with large image/voxel data, which is otherwise difficult to fit into the limited heap size of JavaScript.
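As a rough illustration of why RLE compresses sparse data well, here is a minimal encoder/decoder (function names are made up for this sketch, not part of SciJS): long runs of zeros, the common case in sparse data, collapse to single (value, length) pairs.

```javascript
// Hypothetical sketch: run-length encode a flat row-major array into
// parallel (value, runLength) arrays.
function rleEncode(data) {
  const values = [], lengths = [];
  for (const x of data) {
    const n = values.length;
    if (n > 0 && values[n - 1] === x) {
      lengths[n - 1]++;        // extend the current run
    } else {
      values.push(x);          // start a new run
      lengths.push(1);
    }
  }
  return { values, lengths };
}

function rleDecode({ values, lengths }) {
  const out = [];
  for (let i = 0; i < values.length; i++) {
    for (let k = 0; k < lengths[i]; k++) out.push(values[i]);
  }
  return out;
}

// A mostly-zero 3x4 matrix stored row-major collapses to 5 runs:
const flat = [0, 0, 7, 0, 0, 0, 0, 0, 0, 4, 4, 0];
const rle = rleEncode(flat);
```

Because the encoding is over the flat row-major buffer, a transpose or slice is just a different stride pattern over the same runs, which is the property the paragraph above relies on.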
I have done some preliminary work implementing this as a proof-of-concept, so the idea seems sound, but more work is necessary to get this integrated and working smoothly across the whole ecosystem:
I don't have strong feelings about one of these schemes vs. RLE encoding, except that, for example, the CCS scheme stores row information that keeps you from having to search in order to locate specific rows. When I tried this a long time ago, the sparse LU preconditioner was moderately challenging, since you had to perform operations the storage scheme wasn't designed for, like traversing the matrix in the non-indexed direction. I think the marginal challenge of CRS/CCS over RLE is perhaps outweighed by the potential benefits when it comes time to implement algorithms that go beyond matrix-vector multiplication, but that's just an opinion. 😃
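For reference, the indexed direction the comment mentions is what makes CSR matrix-vector multiplication so cheap. A hedged sketch (hypothetical names, not any library's API): the rowPtr array lets each row's nonzeros be found directly rather than by searching.

```javascript
// Hypothetical sketch: CSR matrix-vector product y = A * x.
// rowPtr[i]..rowPtr[i+1] indexes the nonzeros of row i, so no search
// is needed in the row direction.
function csrMatVec(rowPtr, colIdx, values, x) {
  const y = new Array(rowPtr.length - 1).fill(0);
  for (let i = 0; i < y.length; i++) {
    for (let p = rowPtr[i]; p < rowPtr[i + 1]; p++) {
      y[i] += values[p] * x[colIdx[p]];
    }
  }
  return y;
}

// Example: the 3x3 matrix
// [[1, 0, 2],
//  [0, 0, 0],
//  [0, 3, 0]]
// in CSR form, multiplied by the all-ones vector.
const y = csrMatVec([0, 2, 2, 3], [0, 2, 1], [1, 2, 3], [1, 1, 1]);
```

Traversing the same structure column-by-column (the non-indexed direction) would require scanning every row's colIdx entries, which is exactly the pain point described above for the sparse LU preconditioner.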