is_positive_semidefinite() gives wrong result on sympy==1.6 #19547
Comments
Can I try fixing this one? |
Yes, go ahead |
@jcpaik @oscarbenjamin The naive approach to fix this entails computing all principal minors:

```python
def _is_positive_semidefinite_minors(M):
    """A method to evaluate all principal minors for testing
    positive-semidefiniteness."""
    size = M.rows
    return all(
        M[idx, idx].det(method='berkowitz').is_nonnegative
        for order in range(1, size + 1)
        for idx in itertools.combinations(range(size), order)
    )
```

I am concerned that this will become computationally infeasible for large matrices. |
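To see why every principal minor matters, and not just the leading ones, here is a small illustration (my own example, not from the thread): the matrix [[0, 0], [0, -1]] has nonnegative leading principal minors (0 and 0), so a leading-minor-only test wrongly accepts it, while the all-minors test above catches the negative eigenvalue.

```python
import itertools
from sympy import Matrix

def _is_positive_semidefinite_minors(M):
    # check every principal minor, as in the snippet above
    size = M.rows
    return all(
        M[idx, idx].det(method='berkowitz').is_nonnegative
        for order in range(1, size + 1)
        for idx in itertools.combinations(range(size), order)
    )

M = Matrix([[0, 0], [0, -1]])

# the leading principal minors are det([[0]]) = 0 and det(M) = 0,
# both nonnegative, so a leading-minor-only test accepts M
leading_ok = all(M[:k, :k].det().is_nonnegative for k in range(1, 3))

# but the (non-leading) principal minor det([[-1]]) = -1 exposes
# the negative eigenvalue, so the full test rejects M
full_ok = _is_positive_semidefinite_minors(M)
```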
I added a check on the eigenvalues to handle larger matrices, which seems to be much faster than computing minors:

```python
return all(eigenvalue.is_nonnegative for eigenvalue in M.eigenvals())
```

Not sure how it will work for symbolic matrices though. |
Computing the eigenvalues can be very slow for symbolic matrices, and it is in general not possible for matrices larger than 4x4 where there are symbols in the matrix coefficients. This is because of the Abel-Ruffini theorem. I thought that the method used here was based on this: |
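A quick illustration of the Abel-Ruffini obstruction (my own example, not from the thread): sympy's `roots` can express low-degree roots in radicals, but a generic quintic has no solution in radicals, so `roots` comes back empty and an eigenvalue sign test would have nothing to work with.

```python
from sympy import roots, symbols

x = symbols('x')

# degree 2: closed-form roots in radicals exist (sqrt(2) and -sqrt(2))
quadratic_roots = roots(x**2 - 2, x)

# a generic degree-5 polynomial has no expression in radicals
# (Abel-Ruffini), so roots() finds nothing
quintic_roots = roots(x**5 - x + 1, x)
```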
Sylvester's criterion does work. But this detail
means we need to check more minors of the matrix, increasing the time complexity from polynomial to (I think) exponential. Maybe this is not a huge deal. I'm running some timing benchmarks to get a sense of how far this method can go, and I'll let you make a call from there. |
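To make the complexity concern concrete (a back-of-the-envelope count of my own, not from the thread): an n×n matrix has C(n, k) principal minors of order k, one per subset of the row/column indices, so 2^n - 1 in total, versus only n leading principal minors.

```python
from math import comb

def num_principal_minors(n):
    # one principal minor per nonempty subset of the n row/column indices
    return sum(comb(n, k) for k in range(1, n + 1))

# 2**n - 1 minors in total: 31 for n=5, 4095 for n=12,
# versus only n leading principal minors
counts = [num_principal_minors(n) for n in (5, 10, 12)]
```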
Here is some rough timing data:

```python
from random import random
from sympy import Matrix
from sympy.matrices.eigen import _is_positive_semidefinite_by_minors

def check_random_positive_semidefinite_matrix(n):
    A = Matrix([[random() for i in range(n)] for j in range(n)])
    S = A.T * A
    return _is_positive_semidefinite_by_minors(S)
```

```python
>>> n_runs = 10
>>> avg_time_5 = timeit(lambda: check_random_positive_semidefinite_matrix(5), number=n_runs) / n_runs
>>> avg_time_5
0.033390344999997976
>>> avg_time_10 = timeit(lambda: check_random_positive_semidefinite_matrix(10), number=n_runs) / n_runs
>>> avg_time_10
7.595623607399989
>>> n_runs = 5
>>> avg_time_12 = timeit(lambda: check_random_positive_semidefinite_matrix(12), number=n_runs) / n_runs
>>> avg_time_12
53.047399703599964
```
|
Do you think this is reasonable given the size of the matrices in most use cases? |
For comparison, here is the time required to check that the eigenvalues are non-negative:

```python
>>> from sympy.matrices.eigen import _is_positive_semidefinite_by_eigenvalues
>>> def check_random_positive_definite_matrix_by_eigenvalues(n):
...     A = Matrix([[random() for i in range(n)] for j in range(n)])
...     S = A.T * A
...     return _is_positive_semidefinite_by_eigenvalues(S)
...
>>> check_random_positive_definite_matrix_by_eigenvalues(10)
True
>>> n_runs = 5
>>> evals_avg_time_12 = timeit(lambda: check_random_positive_definite_matrix_by_eigenvalues(12), number=n_runs) / n_runs
>>> evals_avg_time_12
0.5310900253998625
```
|
I'm thinking maybe this is beyond the scope of the issue, and there are probably edge cases I'm not considering. I'll open a PR that just corrects Sylvester's criterion. |
The suggestion here: That's already implemented, although the implementation strictly requires positive definite rather than semidefinite. Maybe that could be changed.

```python
In [5]: A = Matrix([[0,0,0],[0,1,2],[0,2,1]])

In [6]: A.cholesky()
---------------------------------------------------------------------------
NonPositiveDefiniteMatrixError            Traceback (most recent call last)
<ipython-input-6-000abf011399> in <module>
----> 1 A.cholesky()

~/current/sympy/sympy/sympy/matrices/dense.py in cholesky(self, hermitian)
    258
    259     def cholesky(self, hermitian=True):
--> 260         return _cholesky(self, hermitian=hermitian)
    261
    262     def LDLdecomposition(self, hermitian=True):

~/current/sympy/sympy/sympy/matrices/decompositions.py in _cholesky(M, hermitian)
    270
    271     if Lii2.is_positive is False:
--> 272         raise NonPositiveDefiniteMatrixError(
    273             "Matrix must be positive-definite")
    274

NonPositiveDefiniteMatrixError: Matrix must be positive-definite
```

If that check for `Lii2.is_positive` were relaxed, the decomposition might work for semidefinite matrices as well. |
There is a Cholesky decomposition with pivoting that can work as an extension of the positive-definite Cholesky. Although I'm not sure that the existence of this type of Cholesky decomposition guarantees that the matrix is positive semidefinite, I'd bet that it's likely. |
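Here is a rough numeric sketch of that idea (my own illustration with made-up names and tolerance, not sympy's API): symmetric Gaussian elimination with diagonal pivoting runs to completion with nonnegative pivots exactly when a symmetric matrix is positive semidefinite, and a zero pivot is acceptable only if the whole remaining block vanishes.

```python
def is_psd_numeric(a, tol=1e-12):
    """Test a symmetric matrix (list of lists of floats) for positive
    semidefiniteness by pivoted symmetric Gaussian elimination."""
    m = [row[:] for row in a]
    n = len(m)
    for k in range(n):
        # pivot on the largest remaining diagonal entry
        p = max(range(k, n), key=lambda i: m[i][i])
        if m[p][p] < -tol:
            return False          # a negative diagonal entry rules out PSD
        if m[p][p] <= tol:
            # every remaining diagonal entry is ~0; PSD then forces the
            # whole trailing block to vanish
            return all(abs(m[i][j]) <= tol
                       for i in range(k, n) for j in range(k, n))
        # swap rows and columns k and p to bring the pivot into place
        m[k], m[p] = m[p], m[k]
        for row in m:
            row[k], row[p] = row[p], row[k]
        # eliminate below the pivot (Schur complement update)
        for i in range(k + 1, n):
            f = m[i][k] / m[k][k]
            for j in range(k + 1, n):
                m[i][j] -= f * m[k][j]
    return True
```

On the matrix from the traceback above, [[0, 0, 0], [0, 1, 2], [0, 2, 1]], pivoting first brings a positive diagonal entry into place, and the Schur complement then exposes a negative pivot, so the sketch correctly rejects it.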
Thank you. I will take a look at those resources and the existing implementation of `cholesky`. |
I'd reopen this issue to discuss how to address the slowdowns. I may eventually revert to eigenvalue computation if nobody comes up with a good idea about performance, because at least it won't be worse than what was there in the first place. |
I would suggest a mixed approach that uses Sylvester's criterion as a fallback if another method does not work. A quick performance improvement would be to use eigenvalues as in bc4f42c, but we would need to investigate edge cases. Out of curiosity, does this need to support matrix computations over a finite field? |
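A minimal sketch of such a mixed strategy (hypothetical names; `_psd_by_minors` mirrors the all-principal-minors snippet earlier in the thread): try the eigenvalue sign test first, and fall back to principal minors only when a sign query is inconclusive.

```python
import itertools
from sympy import Matrix

def _psd_by_minors(M):
    # exhaustive Sylvester-style fallback: every principal minor nonnegative
    n = M.rows
    return all(
        M[idx, idx].det(method='berkowitz').is_nonnegative
        for order in range(1, n + 1)
        for idx in itertools.combinations(range(n), order)
    )

def is_psd_mixed(M):
    # try the cheap eigenvalue sign test first; if any sign query is
    # inconclusive (None), fall back to principal minors (which is
    # conservative: unknown signs there also count as a failure)
    signs = [ev.is_nonnegative for ev in M.eigenvals()]
    if None not in signs:
        return all(signs)
    return _psd_by_minors(M)
```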
I don't think that we have any rigorous framework for finite field computation yet. So this doesn't need to be resolved in this issue. |
#19573 seems to have resolved this -- correct me if I'm wrong |
The code gives a wrong result `True` for sympy version 1.6. The result is `False` for version 1.5.1. The reason for this is that the following function, introduced in #19205, only checks the leading principal minors, thus failing to check det([[1, 2], [2, 1]]) < 0.
sympy/sympy/matrices/eigen.py
Lines 744 to 753 in bfd2fa0