GCMI in Python gives unexpected results #12

Closed
DominiqueMakowski opened this issue Jul 14, 2022 · 3 comments

@DominiqueMakowski

Hi @robince, I just stumbled on your work and it's super interesting! I see we have somewhat related interests in complexity metrics ☺️

We recently implemented quite a lot of complexity algorithms in NeuroKit, and we also have a function to compute mutual information using several different methods. I'm not an expert in any of this myself, but I managed to adapt and implement some of these methods so that they are easy to use.

I think it'd be great to add GCMI, and I gave it a quick try using your code, but unfortunately the results are somewhat unexpected. I computed the MI of two small series under different noise levels, using "traditional" approaches and GCMI (MI6 in the plot), but the pattern doesn't look like the others...

[Figure: MI estimates from the different methods across noise levels; GCMI is MI6]

Am I missing something? Or misunderstanding how to use this function?

The code to reproduce the figure is in this PR: neuropsychology/NeuroKit#677 (and here's the link to the adaptation, mostly streamlining, of your code: https://github.com/neuropsychology/NeuroKit/pull/677/files).

Let me know what you think! Cheers

@robince
Owner

robince commented Jul 14, 2022

I think your signal is y = x**2 + noise. GCMI won't detect this sort of symmetric effect; it can only detect monotonic effects (it is like a rank correlation). You could try adding abs(x) as a second dimension of x, as in the sketch below.
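For illustration, here is a minimal sketch of that point (this assumes `gcmi_cc` from the `python/gcmi.py` in this repo is importable; the exact numbers will vary with the sample size and noise level):

```python
import numpy as np
from gcmi import gcmi_cc  # python/gcmi.py from this repo, assumed to be on the path

rng = np.random.default_rng(42)
n = 2000
x = rng.standard_normal(n)
noise = 0.1 * rng.standard_normal(n)

y_linear = x + noise       # monotonic relationship
y_squared = x**2 + noise   # symmetric, non-monotonic relationship

print(gcmi_cc(x, y_linear))   # clearly > 0
print(gcmi_cc(x, y_squared))  # close to 0: a rank-based measure misses the symmetric effect

# adding abs(x) as a second dimension recovers sensitivity
# (variables along the first axis, samples along the second)
x_2d = np.stack([x, np.abs(x)])
print(gcmi_cc(x_2d, y_squared))  # > 0 again
```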

Also, I appreciate your interest, but GCMI is licensed under the GPL, so the code can't be reused and released under MIT.

@DominiqueMakowski
Author

Thanks for your reply!

> You could try adding abs(x) as a second dimension of x.

I am not sure what you mean here?

> GCMI is licensed under the GPL

Ah sorry I missed that. Shame, I'll close the PR then!

@robince
Owner

robince commented Jul 14, 2022

nk.mutual_information(np.abs(x), y + noise, method="gc")

would, I think, be sensitive to the y = x**2 effect, but it requires some prior knowledge (and would lose linear effects). In general, if you want sensitivity to a symmetric effect, you can create a 2d variable, np.stack([x, np.abs(x - np.median(x))]), and pass that as a 2d input to GCMI. There is a cost in terms of sensitivity and sampling variance in going to 2d, and it's not very elegant, but just a suggestion :)

nk.mutual_information(np.stack([x, np.abs(x-np.median(x))]), y + noise, method="gc")
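For reference, here is a rough sketch of the same two suggestions written directly against `gcmi_cc` from this repo (the `nk.mutual_information(..., method="gc")` calls above are the wrapper proposed in the NeuroKit PR, so the exact interface there may differ):

```python
import numpy as np
from gcmi import gcmi_cc  # python/gcmi.py from this repo, assumed importable

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
noise = 0.2 * rng.standard_normal(2000)
y = x**2  # the symmetric y = x**2 signal discussed above

# first suggestion: use |x| instead of x (needs prior knowledge that the effect
# is symmetric, and loses sensitivity to plain monotonic effects)
print(gcmi_cc(np.abs(x), y + noise))

# second suggestion: keep x and add |x - median(x)| as a second dimension,
# with variables along the first axis and samples along the second
x_2d = np.stack([x, np.abs(x - np.median(x))])
print(gcmi_cc(x_2d, y + noise))
```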

Sorry about the incompatible license. I have always preferred the GPL for publicly funded academic work, but perhaps that view is a bit out of date now!
