
FIX KernelPCA inverse transform when gamma is not given #26337

Merged: 10 commits merged into scikit-learn:main from kpca-inv-trans on May 17, 2023

Conversation

@Charlie-XIAO (Contributor)

Reference Issues/PRs

Fixes #26280.

What does this implement/fix? Explain your changes.

Quoting @jeremiedbb:

Originally, when `gamma` is None, it is set to `1 / n_features` each time the kernel is called, not once at the beginning of `fit`. This means that when the kernel is called in `inverse_transform`, the number of features is different and hence `gamma` is different.
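
For illustration (this snippet is not part of the PR), the fallback can be reproduced directly with scikit-learn's `rbf_kernel`, which is the function `pairwise_kernels` uses for `kernel="rbf"`:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.RandomState(0)
X_fit = rng.standard_normal((5, 10))  # original data: 10 features
X_low = rng.standard_normal((5, 2))   # transformed data: 2 components

# With gamma=None, rbf_kernel falls back to 1 / n_features of its input,
# so these two calls silently use gamma = 1/10 and gamma = 1/2.
K_fit = rbf_kernel(X_fit)  # gamma = 0.1
K_low = rbf_kernel(X_low)  # gamma = 0.5
```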

This PR intends to set a private attribute `_gamma` at the first `fit` (or `fit_transform`) and use `_gamma` instead of `gamma` in further transforms.
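
A minimal sketch of that idea, assuming the `_gamma` name from the description above (a hypothetical stripped-down class, not the actual scikit-learn implementation; the merged code may differ in detail):

```python
from sklearn.metrics.pairwise import pairwise_kernels


class KernelPCASketch:
    """Hypothetical illustration of the fix, not the real KernelPCA."""

    def __init__(self, kernel="rbf", gamma=None):
        self.kernel = kernel
        self.gamma = gamma

    def fit(self, X):
        # Resolve gamma exactly once, from the training data's feature count.
        self._gamma = 1 / X.shape[1] if self.gamma is None else self.gamma
        # ... rest of fitting (centering, eigendecomposition) omitted ...
        return self

    def _get_kernel(self, X, Y=None):
        # Every later kernel evaluation (transform, inverse_transform, ...)
        # reuses the stored value, so a lower-dimensional input at
        # inverse_transform time can no longer silently change gamma.
        return pairwise_kernels(
            X, Y, metric=self.kernel, filter_params=True, gamma=self._gamma
        )
```

With `_gamma` frozen at fit time, the kernel used by `inverse_transform` matches the one used during `fit`, even though its inputs live in the lower-dimensional component space.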

Any other comments?

I'm not sure whether this approach is neat enough, so if the maintainers would prefer an alternative approach, please let me know!

@Micky774 (Contributor) left a comment:

Hey there @Charlie-XIAO, thanks for the PR! Overall it looks good. I think it is worth including this as a changed model in the changelog, since the same models with `gamma=None` may now produce different results through their `inverse_transform`.

Please feel free to ping me if you have any questions or concerns :)

[2 resolved review threads on sklearn/decomposition/tests/test_kernel_pca.py]
@Charlie-XIAO (Contributor, Author)

Hi @Micky774, thanks for your review! I've committed your suggested changes and moved the changelog entry under "Changed models". Please let me know if there are other changes I need to make, or if my wording in the changelog can be improved.

[Resolved review threads on doc/whats_new/v1.3.rst and sklearn/decomposition/_kernel_pca.py]
@Micky774 (Contributor) left a comment:

LGTM, thanks!

@Micky774 added the labels Waiting for Second Reviewer (first reviewer is done, need a second one) and Quick Review (for PRs that are quick to review) on May 10, 2023.
@jeremiedbb (Member) left a comment:

Thanks for the PR @Charlie-XIAO. Here are some suggestions; otherwise it looks good.

[Resolved review threads on sklearn/decomposition/tests/test_kernel_pca.py, doc/whats_new/v1.3.rst, and sklearn/decomposition/_kernel_pca.py]
@Charlie-XIAO (Contributor, Author)

Thanks for your review @jeremiedbb! I've made your suggested changes.

@jeremiedbb (Member) left a comment:

LGTM. Thanks @Charlie-XIAO

@jeremiedbb merged commit 1a59567 into scikit-learn:main on May 17, 2023.
@Charlie-XIAO deleted the kpca-inv-trans branch on June 28, 2023 at 05:42.
REDVM pushed a commit to REDVM/scikit-learn that referenced this pull request on Nov 16, 2023.
Labels: module:decomposition, Quick Review, Waiting for Second Reviewer
Projects: None yet
Development: Successfully merging this pull request may close issue #26280, "KernelPCA inverse transform behaves unexpectedly."
3 participants