
Feat/logeuclid mean and logeuclid distance to convex hull #244

Merged: 90 commits merged into pyRiemann:main on Feb 27, 2024

Conversation

gcattan (Collaborator) commented on Feb 18, 2024

This PR implements the log-Euclidean mean with a constraint programming model, and fixes the log-Euclidean distance to the convex hull.

toncho11 (Collaborator) commented

Do you have first results?

gcattan (Collaborator, author) commented on Feb 18, 2024

Yes, and it is not working. I guess it will not be that easy to implement ^^

gcattan (Collaborator, author) commented on Feb 19, 2024

@qbarthelemy, what do you think of this implementation? Am I missing something?

toncho11 (Collaborator) commented on Feb 19, 2024

@qbarthelemy So it is like the previous implementation, but now logm is applied to the covariance matrices, then the mean is defined with the constraint programming model, and expm is applied at the end. So it is like in pyRiemann:
https://github.com/pyRiemann/pyRiemann/blob/9214eb3e1d00c51210c2886cc7e5fc8a54dbc7ef/pyriemann/utils/mean.py#L332
Is that OK?
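
For reference, here is a minimal NumPy sketch of the classical computation that the constraint programming model is meant to reproduce, i.e. the weighted log-Euclidean mean linked above: logm of each matrix, weighted average, then expm. The helper names and the toy SPD matrices are made up for illustration; this is not code from the PR.

```python
import numpy as np

def _logm_spd(C):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    eigvals, eigvecs = np.linalg.eigh(C)
    return eigvecs @ np.diag(np.log(eigvals)) @ eigvecs.T

def _expm_sym(S):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    eigvals, eigvecs = np.linalg.eigh(S)
    return eigvecs @ np.diag(np.exp(eigvals)) @ eigvecs.T

def mean_logeuclid_sketch(covmats, weights=None):
    """expm of the weighted average of the logm of each covariance matrix."""
    n_matrices = len(covmats)
    if weights is None:
        weights = np.ones(n_matrices) / n_matrices
    log_mean = sum(w * _logm_spd(C) for w, C in zip(weights, covmats))
    return _expm_sym(log_mean)

# Toy SPD matrices, for illustration only.
rng = np.random.default_rng(42)
A = rng.standard_normal((3, 2, 2))
covmats = A @ A.transpose(0, 2, 1) + 2 * np.eye(2)
print(mean_logeuclid_sketch(covmats))
```

The PR replaces the explicit weighted average with an optimization over the weights, but the logm / average / expm structure is the same.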

Review comments (now resolved) were left on: doc/api.rst, pyriemann_qiskit/utils/mean.py, pyriemann_qiskit/pipelines.py, pyriemann_qiskit/classification.py, pyriemann_qiskit/utils/distance.py

gcattan and others added 6 commits on February 19, 2024 (co-authored by Quentin Barthélemy <q.barthelemy@gmail.com>)

gcattan (Collaborator, author) commented on Feb 19, 2024

Thanks for the review, @qbarthelemy. I will address the comments as soon as possible.
I did some tests today with the new log-Euclidean distance.

I see a difference in the matrices depending on whether I do:
logm(covmats) or np.array([logm(covmat) for covmat in covmats])

The first option returns matrices with mostly zeros.

Then, when I take the exponential of the mean returned by the optimizer, it is always a diagonal matrix.

I suspect there is some issue with the optimizer. I can already see that Adam provides better results than COBYLA when debugging with a classical optimizer.
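
As for the logm(covmats) vs per-matrix-loop difference: a quick, hypothetical way to diagnose which of the two calls is correct (not code from the PR) is to compare both against an explicit per-matrix eigendecomposition on toy SPD matrices. If a library logm silently accepts a stacked (n_matrices, n, n) array without applying the log per matrix, it will disagree with the loop.

```python
import numpy as np
from scipy.linalg import logm  # operates on a single square matrix

def logm_spd_batch(covmats):
    """Per-matrix log of a stack of SPD matrices, via batched eigendecomposition."""
    eigvals, eigvecs = np.linalg.eigh(covmats)      # eigh broadcasts over the first axis
    log_diag = np.zeros_like(covmats)
    idx = np.arange(covmats.shape[-1])
    log_diag[..., idx, idx] = np.log(eigvals)       # diagonal matrices of log-eigenvalues
    return eigvecs @ log_diag @ np.swapaxes(eigvecs, -2, -1)

# Toy SPD stack, for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3, 3))
covmats = A @ A.transpose(0, 2, 1) + 3 * np.eye(3)

loop = np.array([logm(C) for C in covmats])         # the per-matrix loop from the comment above
batch = logm_spd_batch(covmats)
print(np.allclose(loop, batch))                     # expected: True; a mismatch points at the logm call
```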

gcattan and others added 5 commits on February 26, 2024 (co-authored by Quentin Barthélemy <q.barthelemy@gmail.com>)

gcattan (Collaborator, author) commented on Feb 26, 2024

Hm. The objective should be minimal when the weighted sum of the covariance matrices (Xi) tends towards Y.
I will add the constraint, but this could be an issue for QAOA, as it only accepts quadratic unconstrained binary problems. Let's see if it can manage this.

I cannot remember why this 1 - x, to be honest. It looks like the prediction of the MDM was inverted, but I cannot remember why. I have moved this pre-processing outside the function for the moment.
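
On the QAOA point: the usual workaround is to fold the equality constraint into the objective as a quadratic penalty, turning the problem back into an unconstrained QUBO. A generic sketch with qiskit-optimization's converter, on a toy binary problem (this is not the PR's actual model, only the conversion idea):

```python
from qiskit_optimization import QuadraticProgram
from qiskit_optimization.converters import LinearEqualityToPenalty

# Toy constrained binary problem: minimize x0 + 2*x1, subject to x0 + x1 == 1.
qp = QuadraticProgram("toy")
qp.binary_var(name="x0")
qp.binary_var(name="x1")
qp.minimize(linear={"x0": 1, "x1": 2})
qp.linear_constraint(linear={"x0": 1, "x1": 1}, sense="==", rhs=1, name="one_weight")

# Fold the equality constraint into the objective as a quadratic penalty term,
# so the resulting problem is an unconstrained QUBO that QAOA can accept.
qubo = LinearEqualityToPenalty(penalty=10).convert(qp)
print(qubo.export_as_lp_string())
```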

qbarthelemy changed the title from "Feat/log euclid mean" to "Feat/logeuclid mean and logeuclid distance to convex hull" on Feb 26, 2024

qbarthelemy (Member) left a review comment

> Hm. The objective should be minimal when the weighted sum of the covariance matrices (Xi) tends towards Y.

Yes, you are right. But null weights give the same minimal objective.
Before merging, can you add check_weights on the master version, to check whether the weights were always null or not?

Furthermore, the objective can become even smaller if the weights are negative.
Is it possible to add a positivity constraint in the optimizer (like in Eq. (22) of Zhao2019)?
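
To illustrate the effect of such a positivity constraint with a purely classical stand-in (this is not the PR's optimizer; the convex-hull objective is assumed here to be ||sum_i w_i log(X_i) - log(Y)||_F with sum_i w_i = 1):

```python
import numpy as np
from scipy.optimize import minimize

def _logm_spd(C):
    eigvals, eigvecs = np.linalg.eigh(C)
    return eigvecs @ np.diag(np.log(eigvals)) @ eigvecs.T

def dist_convex_hull_sketch(covmats, Y):
    """Log-Euclidean distance from Y to the convex hull of the covmats, with w >= 0, sum(w) = 1."""
    logs = np.array([_logm_spd(C) for C in covmats])
    log_y = _logm_spd(Y)

    def objective(w):
        return np.linalg.norm(np.einsum("i,ijk->jk", w, logs) - log_y)

    n = len(covmats)
    w0 = np.ones(n) / n
    res = minimize(
        objective,
        w0,
        method="SLSQP",
        bounds=[(0.0, 1.0)] * n,  # positivity of the weights
        constraints=[{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}],
    )
    return res.fun, res.x

# Toy check: the distance to a matrix that belongs to the hull should be close to 0.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3, 3))
covmats = A @ A.transpose(0, 2, 1) + 3 * np.eye(3)
distance, weights = dist_convex_hull_sketch(covmats, covmats[0])
print(distance, weights)
```

Without the bounds and the sum-to-one constraint, the all-zero weight vector would trivially drive such an objective down, which is why the check on the weights matters.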

qbarthelemy (Member) commented

In fact, tests fail, raising the new check "Weights must be strictly positive":

https://github.com/pyRiemann/pyRiemann-qiskit/actions/runs/8061607519/job/22019679900#step:7:401

gcattan (Collaborator, author) commented on Feb 27, 2024

[screenshot]

We can still set the lower bound to some minimum, such as 10^-4.

However, maybe in this case the problem is with the matrices that are generated for the test.
We can use MNE data, as is done for the distance.

Actually, this test here is redundant:

[screenshot]

It should be located in test_utils_distance.py (where it is already implemented).
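
Regarding the lower bound suggested above: assuming the weights are declared through a docplex model, as the other constraint programming utilities of this package appear to be (an assumption on my side), the bound could be set at variable-declaration time. A hedged sketch with a placeholder objective:

```python
from docplex.mp.model import Model

n_matrices = 4
mdl = Model("logeuclid_weights_sketch")

# Strictly positive weights via an explicit lower bound such as 1e-4,
# instead of relying on the optimizer to avoid exact zeros.
w = mdl.continuous_var_list(n_matrices, lb=1e-4, ub=1.0, name="w")
mdl.add_constraint(mdl.sum(w) == 1)

# Placeholder objective, for illustration only; the real objective is the
# log-Euclidean distance to the convex hull in pyriemann_qiskit/utils/distance.py.
mdl.minimize(mdl.sum(w[i] * w[i] for i in range(n_matrices)))

print(mdl.export_as_lp_string())  # solving would additionally require a CPLEX engine
```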

gcattan merged commit ca33847 into pyRiemann:main on Feb 27, 2024. 13 checks passed.