
Ensure sum of positive correlations between X and U is positive #27

Closed
ashnair1 opened this issue May 26, 2021 · 4 comments
ashnair1 commented May 26, 2021

Hi @mortcanty,

I had two questions regarding this section.

  1. The comment states that the aim of this section is to ensure that the sum of positive correlations between X and U is positive. What exactly is the purpose of this step, and what is X in this case? From the book it is understood that U is the new multispectral image generated from the eigenvalue problem, but it's unclear what X is.
  2. This step appears to be missing from the iMad_tf.py script. Is there a reason for this, or did I miss something?

Thanks

mortcanty (Owner) commented

Thanks for your interest, it is appreciated.

  1. There are ambiguities in the relative signs of the canonical transformations; the only essential step is to ensure that the canonical variates are positively correlated. Even when this is done, an ambiguity remains in the relative signs, which affects the overall signs of the MAD variates but not their absolute values. The additional step is intended to remove that ambiguity, but it's not essential. X is the first image.
  2. I didn't bother to include the step in the tensorflow implementation (laziness, and also the script isn't mentioned in my book).
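The sign-fixing step described above could be sketched roughly as follows. This is a hypothetical illustration, not the actual code from iMad.py: the names `fix_signs`, `X`, and `A` are assumed here, and the convention (bands as rows, pixels as columns, eigenvectors as columns of `A`) is one of several that would work.

```python
import numpy as np

def fix_signs(X, A):
    """Flip the sign of each canonical eigenvector (column of A) so that the
    sum of correlations between the bands of image X and the corresponding
    canonical variate U_j = A[:, j].T @ X is positive.
    X: (bands, pixels) image data; A: (bands, bands) eigenvector matrix."""
    U = A.T @ X  # canonical variates, one per row
    # Standardize bands and variates, then form the correlation matrix
    # corr[i, j] = corr(X_i, U_j).
    Xs = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
    Us = (U - U.mean(axis=1, keepdims=True)) / U.std(axis=1, keepdims=True)
    corr = Xs @ Us.T / X.shape[1]
    # For each variate, flip its eigenvector if the summed correlation
    # with the bands of X is negative.
    signs = np.sign(corr.sum(axis=0))
    signs[signs == 0] = 1.0
    return A * signs  # broadcasts the flip over columns of A
```

After this step, each canonical variate is, on balance, positively correlated with the bands of X, which pins down the otherwise arbitrary overall signs.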

ashnair1 (Author) commented May 26, 2021

Ah, I see. So that is why the outputs of iMad.py and iMad_tf.py differ in sign, i.e. if you compare corresponding bands of the two outputs, their signs are sometimes different.

Where do these sign ambiguities come from though?

Edit: Oh wait, this is because of the eigenvectors, isn't it? Even after normalisation an eigenvector is only determined up to sign, so the solver may return either +x or -x.
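A tiny self-contained illustration of this ambiguity (not taken from either script): if v is a unit eigenvector of a symmetric matrix C, then so is -v, and a numerical solver is free to return either one.

```python
import numpy as np

# A symmetric matrix with eigenvalues 1 and 3.
C = np.array([[2.0, 1.0], [1.0, 2.0]])
w, V = np.linalg.eigh(C)
v = V[:, 0]  # a unit eigenvector for the smallest eigenvalue

# Both v and -v satisfy the eigenvalue equation equally well,
# so the sign of the returned eigenvector is arbitrary.
assert np.allclose(C @ v, w[0] * v)
assert np.allclose(C @ (-v), w[0] * (-v))
```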

mortcanty (Owner) commented

Yep. If you're really interested, you might have a look at
https://old-www.sandia.gov/~tgkolda/pubs/pubfiles/SAND2007-6422.pdf
I admit I haven't read it myself. :)

ashnair1 (Author) commented

Thanks a lot for the clarification and the reference.
