
Add Ops for specialized dot products with structured matrices #1323

Open
jessegrabowski opened this issue Mar 26, 2025 · 3 comments
Labels: graph rewriting, help wanted, linalg, performance

Comments

@jessegrabowski (Member)

Description

BLAS/LAPACK offer a number of specialized routines for dot products where a matrix has special structure (e.g. triangular, symmetric, banded).

We can implement these as non-user-facing Ops and rewrite into them whenever we can infer the special structure of the matrices in a `dot` Op.
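As a concrete illustration of the kind of routine involved, here is a minimal sketch using SciPy's low-level BLAS wrappers: a triangular matrix-matrix product via `trmm`, which reads only one triangle, versus the general `@` (gemm) path. The shapes and seed are arbitrary.

```python
import numpy as np
from scipy.linalg import blas

rng = np.random.default_rng(0)
n = 512
L = np.tril(rng.standard_normal((n, n)))  # lower-triangular matrix
B = rng.standard_normal((n, n))

# General matmul ignores the structure; dtrmm only touches the lower triangle.
out_general = L @ B
out_trmm = blas.dtrmm(1.0, L, B, lower=1)

np.testing.assert_allclose(out_general, out_trmm)
```

A rewrite would perform exactly this substitution on the graph: when the left operand of a `dot` is known to be triangular, dispatch to `trmm` instead of `gemm`.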

@jessegrabowski (Member, Author)

I was thinking we might want an `assume_a` property on `dot`. It could be hidden from the user and only used in rewrites, or maybe not. I had mentioned the possibility of something similar for `lu_factor` to @ricardoV94, since there are many specialized algorithms for that as well that could be useful for certain types of graphs.

The only wrinkle is that this would break API parity with numpy/scipy.

@ricardoV94 (Member) commented Mar 27, 2025

We should benchmark to see if it indeed buys us anything for regular use cases. Sometimes the general functions still beat the specialized ones just because they have been tuned far more.

More generally, I wonder if we should revisit the old Hint Op that used to exist in Theano: https://github.com/Theano/Theano/blob/7505f23edd66fcdbbe04953f53c650b4d7cec4dd/theano/sandbox/linalg/ops.py#L34-L231

The idea is that calling `assume_a` provides a hint about the type of input we have. Rewrites should be able to exploit this information regardless of where it originated: if an input is marked as symmetric for `dot`, it should also be considered symmetric for a `solve`, even if the solve was called with the default `assume_a="general"`.

This way the user can also wrap their inputs with a `Hint` without having to worry about `assume_a`. This of course goes hand in hand with machinery to propagate type information across Ops.

I would probably call it `specify_type` instead of `hint`, and be precise about which hints are available.
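To make the proposal concrete, here is a framework-agnostic sketch of the idea: `specify_type` as an identity-like wrapper that tags its input with structural hints, which a rewrite can then query when choosing a dot implementation. All names here (`TaggedVar`, `pick_dot_impl`, the hint strings) are hypothetical, not PyTensor's actual API.

```python
# Hypothetical sketch of the specify_type idea; not PyTensor's real API.
from dataclasses import dataclass, field

@dataclass
class TaggedVar:
    """An identity wrapper that carries structural hints alongside a value."""
    value: object
    tags: set = field(default_factory=set)

def specify_type(x, *tags):
    """Identity op recording structural hints the rewriter can exploit."""
    allowed = {"symmetric", "lower_triangular", "upper_triangular",
               "tridiagonal", "pos_def"}
    unknown = set(tags) - allowed
    if unknown:
        raise ValueError(f"Unknown hints: {unknown}")
    return TaggedVar(x, set(tags))

def pick_dot_impl(a):
    """A rewrite dispatches on the hints, not on how the graph was built."""
    if isinstance(a, TaggedVar):
        if "tridiagonal" in a.tags:
            return "tridiagonal_dot"
        if {"lower_triangular", "upper_triangular"} & a.tags:
            return "trmm"
        if "symmetric" in a.tags:
            return "symm"
    return "gemm"  # fall back to the general routine

A = specify_type([[1.0, 0.0], [2.0, 3.0]], "lower_triangular")
print(pick_dot_impl(A))  # -> "trmm"
```

The key property is the one described above: the hint lives on the variable, so any downstream consumer (`dot`, `solve`, ...) can use it, regardless of which call originally supplied it.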

@jessegrabowski (Member, Author)

I did some benchmarks here: https://gist.github.com/jessegrabowski/f194bf5cdf92937b42075027fa46f3d5

Summary: the symmetric routines don't seem worth it. Triangular matrix-vector shows a speedup for big matrices, but not for small ones. Triangular matrix-matrix is a ~2x speedup. Tridiagonal matrix-vector is a huge speedup; we should definitely do that one.
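For reference, a minimal timing harness along the lines of the gist, for the triangular matrix-vector case (`trmv` vs. dense matvec). The size and repetition count are arbitrary; note that SciPy's f2py wrappers copy non-Fortran-ordered arrays on every call, which can swamp the BLAS-level difference, so the matrix is made Fortran-ordered up front.

```python
import timeit

import numpy as np
from scipy.linalg import blas

rng = np.random.default_rng(0)
n = 2000
# Fortran order avoids a per-call copy inside the f2py wrapper.
L = np.asfortranarray(np.tril(rng.standard_normal((n, n))))
x = rng.standard_normal(n)

# Correctness: trmv reads only the triangle; result matches the dense matvec.
np.testing.assert_allclose(L @ x, blas.dtrmv(L, x, lower=1))

t_gemv = timeit.timeit(lambda: L @ x, number=200)
t_trmv = timeit.timeit(lambda: blas.dtrmv(L, x, lower=1), number=200)
print(f"dense matvec: {t_gemv:.4f}s, trmv: {t_trmv:.4f}s")
```

Timings will of course vary with BLAS backend and matrix size, which is consistent with the finding above that the triangular matvec only pays off for large matrices.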
