
Use QR decomposition in DD sharp #90

Merged
merged 7 commits into main from llm/sharp-qr
May 23, 2024
Conversation

lukem12345
Member

@lukem12345 lukem12345 commented May 22, 2024

Close #89

I only did a quick spot check that the LLS matrix computed via the pinv method matched the LLS matrix for a single triangle for a single dual 1-form. Those matrices were equivalent (save for a ~1e-17 entry in the pinv LLS being replaced by a 0.0 entry in the qr LLS), which I took as a good sign.

Tests run, but certain tolerance checks no longer pass, including those operators which make use of the LLSDDSharp internally.

I need to dive into the test outputs and verify whether it is a tolerance issue, or whether it is due to vector fields no longer satisfying the tangent condition.
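A minimal sketch of that spot check, with a hypothetical full-rank 3×2 system standing in for a single triangle's data (not the actual mesh values):

```julia
using LinearAlgebra

# Hypothetical full-rank 3×2 system standing in for one triangle's data.
X = [ 0.5   1.0;
     -1.0   0.5;
      0.25 -0.75]

# LLS matrix via the normal equations and pinv (the pinv method)...
L_pinv = pinv(X' * X) * X'

# ...and via QR: for full-column-rank X, R⁻¹Qᵀ is the same least-squares map.
F = qr(X)
L_qr = F.R \ Matrix(F.Q)'

# The two should agree entrywise up to ~1e-16 roundoff.
maximum(abs.(L_pinv - L_qr))
```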

@lukem12345 lukem12345 added the enhancement New feature or request label May 22, 2024
@lukem12345 lukem12345 self-assigned this May 22, 2024
@lukem12345
Member Author

lukem12345 commented May 22, 2024

Error can be introduced by realizing the inverse computed via the QR decomposition and storing it inside the sharp matrix. We could explore an option that caches the QR decomposition for each triangle while using a matrix with the sparsity pattern of $d_1$.

However, the interpolating musical operation $\sharp\flat$ would no longer be a single matrix.

We could resolve this by supporting all options.
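A hypothetical sketch of the caching option (random stand-in systems, not CombinatorialSpaces code): keep one QR factorization per triangle and solve lazily, rather than realizing a single sharp matrix.

```julia
using LinearAlgebra

# Hypothetical per-triangle LLS systems (random stand-ins for mesh data).
triangle_systems = [rand(3, 2) for _ in 1:4]

# Cache one pivoted QR factorization per triangle...
qr_cache = [qr(X, ColumnNorm()) for X in triangle_systems]

# ...then each sharp application solves a small least-squares problem
# per triangle instead of multiplying by a stored (pseudo)inverse.
sharp_apply(cache, b, i) = cache[i] \ b

b = rand(3)
x = sharp_apply(qr_cache, b, 1)
```

The trade-off noted above remains: the interpolating musical operation is then a callable, not a single matrix.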

@lukem12345
Member Author

lukem12345 commented May 22, 2024

The LLS matrix computed via the qr method is a 3×3 matrix of NaNs for the tri_345 mesh.

X in this case is

julia> X
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3): 
  0.5   1.33333    0.0
 -1.0  -0.666667   0.0
 -0.5   0.666667  -0.0

Observe that:

julia> QRX.R
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3):
 -1.22474  -0.816497  0.0
  0.0      -1.41421   0.0
  0.0       0.0       0.0

julia> inv(QRX.R)
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3):
 NaN  NaN  NaN
 NaN  NaN  NaN
 NaN  NaN  NaN

The LLS computed via the pinv method is

julia> pinv(X'*(X))*(X')
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3):
 -8.63507e-17  -0.666667     -0.666667
  0.5          -5.55112e-17   0.5
  0.0           0.0           0.0
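The NaNs trace back to inverting a singular R: the X above is rank deficient, since its third column is identically zero. A quick check with plain Julia arrays (X reconstructed from the output above):

```julia
using LinearAlgebra

# X from the tri_345 spot check, reconstructed as a plain Matrix.
# Its third column is identically zero, so X has rank 2.
X = [ 0.5   4/3  0.0;
     -1.0  -2/3  0.0;
     -0.5   2/3  0.0]

rank(X)                 # 2: X is rank deficient
R = qr(X).R
R[3, 3]                 # zero pivot, so inverting R cannot succeed
all(isfinite, pinv(X))  # true: the pseudoinverse is still well defined
```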

@lukem12345
Member Author

@jpfairbanks Please advise: Do you want to use pinv instead of inv on QRX.R?

@jpfairbanks
Member

I was expecting that qr produced the reduced Q and R factors. Would that avoid this possibility? I guess X could be rank deficient instead of overconstrained.

@jpfairbanks
Member

The qr function docs say that you have to enable pivoting to get the thin QR decomposition. I think that using that will give you an R that is always full rank.

  The following functions are available for the QR objects: inv, size, and \. When A is rectangular, \ will return a least squares solution and if the solution is not unique, the one with smallest norm is returned. When A is not full rank, factorization
  with (column) pivoting is required to obtain a minimum norm solution.

  Multiplication with respect to either full/square or non-full/square Q is allowed, i.e. both F.Q*F.R and F.Q*A are supported. A Q matrix can be converted into a regular matrix with Matrix. This operation returns the "thin" Q factor, i.e., if A is m×n with
  m>=n, then Matrix(F.Q) yields an m×n matrix with orthonormal columns. To retrieve the "full" Q factor, an m×m orthogonal matrix, use F.Q*I or collect(F.Q). If m<=n, then Matrix(F.Q) yields an m×m orthogonal matrix.

@lukem12345
Member Author

Ok. I've gone ahead and reduced the error tolerances in appropriate tests. (Note that the sharp matrix computed on main passes these tighter tolerances.)

@lukem12345
Member Author

This does not appear to work. I'm not certain of the formula for realizing a pseudoinverse matrix from the pivoted QR decomposition in a form that can be spread out into our sharp matrix.

julia> X
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3):
  0.5   1.33333    0.0
 -1.0  -0.666667   0.0
 -0.5   0.666667  -0.0

julia> QRX = qr(X, ColumnNorm())
StaticArrays.QR{MMatrix{3, 3, Float64, 9}, MMatrix{3, 3, Float64, 9}, MVector{3, Int64}}(
  [-0.816496580927726    -9.337025903963966e-18  -0.5773502691896258;
    0.4082482904638631   -0.7071067811865475     -0.5773502691896257;
   -0.4082482904638631   -0.7071067811865475      0.5773502691896257],
  [-1.632993161855452 -0.6123724356957945 0.0;
    0.0 1.0606601717798214 0.0;
    0.0 0.0 -0.0],
  [2, 1, 3])

julia> QRX.R
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3):
 -1.63299  -0.612372   0.0
  0.0       1.06066    0.0
  0.0       0.0       -0.0

julia> inv(QRX.R)
3×3 MMatrix{3, 3, Float64, 9} with indices SOneTo(3)×SOneTo(3):
 NaN  NaN  NaN
 NaN  NaN  NaN
 NaN  NaN  NaN

@lukem12345
Member Author

These authors provide a formula on page 5:
https://arxiv.org/abs/1102.1845
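For reference, the identity can be checked directly with plain Julia arrays: with column pivoting, A·P = Q·R, so A⁺ = P·R⁺·Qᵀ. A sketch using the rank-deficient X from the tri_345 spot check:

```julia
using LinearAlgebra

# Rank-deficient X from the tri_345 spot check (third column is zero).
A = [ 0.5   4/3  0.0;
     -1.0  -2/3  0.0;
     -0.5   2/3  0.0]

# Pivoted QR: A * F.P == F.Q * F.R, with F.P a permutation matrix.
F = qr(A, ColumnNorm())

# Pseudoinverse realized from the factors: A⁺ = P * R⁺ * Qᵀ.
Apinv = F.P * pinv(F.R) * F.Q'

maximum(abs.(Apinv - pinv(A)))   # agrees with the SVD-based pinv up to roundoff
```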

@lukem12345
Member Author

lukem12345 commented May 22, 2024

Commit a6dabb4 realizes the pseudoinverse matrix according to the provided formula. I performed a quick spot check by computing the LLS matrix using the method on main, and a method using the formula for the pseudoinverse via the pivoted QR decomposition. Observe that in the method on main, a particular entry is -1.90099e-17, and the corresponding entry computed via the QR method is 3.29719e-17.

julia> pinv(X'*(X))*(X')
2×3 MMatrix{2, 3, Float64, 6} with indices SOneTo(2)×SOneTo(3):
 0.21  -1.90099e-17   0.21
 0.1   -0.2          -0.1

julia> P' * pinv(QRX.R) * QRX.Q'
2×3 MMatrix{2, 3, Float64, 6} with indices SOneTo(2)×SOneTo(3):
 0.21   3.29719e-17   0.21
 0.1   -0.2          -0.1

I had to roll my own permutation matrix P because qr of an MMatrix returns only p and not P.

@lukem12345 lukem12345 marked this pull request as ready for review May 22, 2024 17:34
@lukem12345
Member Author

@jpfairbanks This PR is ready to review.

src/DiscreteExteriorCalculus.jl (review thread, outdated, resolved)
@jpfairbanks
Member

Instead of creating a permutation matrix and multiplying, you can just index into the rows of the matrix.

julia> I(4)
4×4 Diagonal{Bool, Vector{Bool}}:
 1  ⋅  ⋅  ⋅
 ⋅  1  ⋅  ⋅
 ⋅  ⋅  1  ⋅
 ⋅  ⋅  ⋅  1

julia> I(4)[[2,1,3,4], :]
4×4 Matrix{Bool}:
 0  1  0  0
 1  0  0  0
 0  0  1  0
 0  0  0  1
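Applied to the `P' * pinv(QRX.R) * QRX.Q'` computation, this means the permutation matrix need not be formed at all: left-multiplying by a permutation matrix is the same as row indexing with `invperm`. A sketch with plain arrays:

```julia
using LinearAlgebra

# Rank-deficient example from the tri_345 spot check.
X = [ 0.5   4/3  0.0;
     -1.0  -2/3  0.0;
     -0.5   2/3  0.0]
F = qr(X, ColumnNorm())

# Forming the permutation matrix explicitly from F.p...
P = I(3)[:, F.p]                # columns permuted so that X * P == X[:, F.p]
via_matrix = P * pinv(F.R) * F.Q'

# ...versus pure row indexing, with no matrix built:
via_indexing = (pinv(F.R) * F.Q')[invperm(F.p), :]

maximum(abs.(via_matrix - via_indexing))   # zero: same entries either way
```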

@lukem12345
Member Author

No problem. I've performed the operation in place now. This PR is ready for review.

@lukem12345
Member Author

Perhaps it's just too late in the day, but I've gone ahead and replaced the calculation of the pinv with Julia's native one, which uses the SVD. It feels a little wrong to be rolling our own pseudoinverse implementation inside the CombinatorialSpaces library.
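For reference, the SVD route can be sketched by hand (`pinv_svd` is a hypothetical helper for illustration, not the Base or CombinatorialSpaces implementation): invert only the singular values above a tolerance and zero the rest, which is what makes it robust to rank deficiency.

```julia
using LinearAlgebra

# Hand-rolled sketch of an SVD-based pseudoinverse (pinv_svd is a
# hypothetical helper, not the library's implementation): invert only
# the singular values above a tolerance, zeroing the rest.
function pinv_svd(A; rtol = eps(Float64) * minimum(size(A)))
    U, S, V = svd(A)
    tol = rtol * maximum(S)
    Sinv = [s > tol ? inv(s) : zero(s) for s in S]
    return V * Diagonal(Sinv) * U'
end

# Rank-deficient example from the tri_345 spot check.
X = [ 0.5   4/3  0.0;
     -1.0  -2/3  0.0;
     -0.5   2/3  0.0]

maximum(abs.(pinv_svd(X) - pinv(X)))   # agrees with Base's pinv up to roundoff
```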

@jpfairbanks
Member

Well, at least we aren't forming the normal equations anymore. Taking the SVD of X and using the orthogonal factors to compute the pinv is probably the most accurate thing you can do for general problems.

We should still revisit our design choice to use a matrix representation instead of a LinearOperator callable struct.

@jpfairbanks jpfairbanks dismissed their stale review May 23, 2024 01:04

Changes made

@jpfairbanks jpfairbanks merged commit 75849dc into main May 23, 2024
7 checks passed
@epatters epatters deleted the llm/sharp-qr branch May 23, 2024 04:04
Successfully merging this pull request may close these issues.

QR decomposition for Least-Squares in LLS sharp