
ENH TransformedTargetRegressor.fit() raises if only inverse_func is provided #28483

Merged

Conversation

StefanieSenger (Contributor)

What does this implement/fix? Explain your changes.

This PR proposes raising an error if users only provide inverse_func to TransformedTargetRegressor() without explicitly setting func as well.

It's loosely connected to #28480, where I came across this issue.

Not sure if I should add a changelog entry here.
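
For illustration only (the data, the estimator and the np.expm1 choice are my own, and the exact error message is up to the implementation), here is a minimal sketch of the case this PR turns into an error:

import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.linear_model import LinearRegression

X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.exp(np.arange(10, dtype=float))

# Only inverse_func is given; func stays at its default of None.
reg = TransformedTargetRegressor(regressor=LinearRegression(), inverse_func=np.expm1)

# Before this PR: func silently falls back to the identity and fit() only warns.
# After this PR: fit() raises instead, forcing the user to set func explicitly.
reg.fit(X, y)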


github-actions bot commented Feb 20, 2024

✔️ Linting Passed

All linting checks passed. Your pull request is in excellent shape! ☀️

Generated for commit: fd3fd7c.

@StefanieSenger StefanieSenger changed the title TransformedTargetRegressor.fit() raises if only inverse_func is provided MNT TransformedTargetRegressor.fit() raises if only inverse_func is provided Feb 20, 2024
@jeremiedbb (Member) left a comment

Thanks for the PR @StefanieSenger.

Currently, when inverse_func is passed and func is left to None, func falls back to the identity and we get a warning because the two functions are not inverses of each other.

I agree that falling back to the identity when inverse_func is passed is not a desirable behavior, so I'm +1 to raise an error. Since it's a change of behavior, it requires an entry in the changelog though.
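
To make the pre-PR behavior concrete, here is a rough, standalone reconstruction of the fallback-plus-warning logic described above; the helper name _check_inverse_sketch and its details are my simplification, not the actual scikit-learn source:

import warnings
import numpy as np

def _check_inverse_sketch(func, inverse_func, y):
    # Hypothetical simplification of the pre-PR logic: fall back to the
    # identity when func is None, then warn if the pair does not round-trip
    # a subsample of y.
    func = func if func is not None else (lambda x: x)
    if inverse_func is not None:
        y_sel = np.asarray(y)[:10]
        if not np.allclose(y_sel, inverse_func(func(y_sel))):
            warnings.warn(
                "The provided functions are not strictly inverse of each other.",
                UserWarning,
            )

# With func=None and inverse_func=np.expm1, the identity fallback fails the
# round-trip check, so the user only got a warning instead of an error.
_check_inverse_sketch(None, np.expm1, np.exp(np.arange(5, dtype=float)))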

Comment on lines 177 to 179
if (self.func is not None and self.inverse_func is None) or (
self.inverse_func is not None and self.func is None
):
Member

You can simplify this condition, but I'm not sure it helps readability, so feel free to ignore :)

Suggested change
if (self.func is not None and self.inverse_func is None) or (
self.inverse_func is not None and self.func is None
):
if (self.func is not None) == (self.inverse_func is None):

StefanieSenger (Contributor, Author)

I think this is difficult to read and would prefer to keep the other code, if that's okay.
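
As a side note (my own check, not part of the review), the two forms of the condition flag exactly the same cases:

from itertools import product

def explicit(func, inverse_func):
    return (func is not None and inverse_func is None) or (
        inverse_func is not None and func is None
    )

def compact(func, inverse_func):
    return (func is not None) == (inverse_func is None)

# Both are True exactly when one of the two functions is set and the other is not.
for func, inverse_func in product([None, abs], repeat=2):
    assert explicit(func, inverse_func) == compact(func, inverse_func)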

@StefanieSenger (Contributor, Author)

Thanks @jeremiedbb, I should have explained it the way you did.
I also think that in this case we should force the user to set something explicitly. The warning is then reserved for other inputs, where func and inverse_func don't match.
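
For completeness, a hedged example of the case that should keep warning rather than raising; the data and the np.log / np.sqrt pair are arbitrary choices of mine:

import warnings
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.linear_model import LinearRegression

X = np.arange(1, 11, dtype=float).reshape(-1, 1)
y = np.arange(1, 11, dtype=float)

# Both functions are provided but they are not inverses of each other,
# so fit() goes through and the check_inverse round-trip emits a UserWarning.
reg = TransformedTargetRegressor(
    regressor=LinearRegression(), func=np.log, inverse_func=np.sqrt
)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    reg.fit(X, y)
print([str(w.message) for w in caught])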

@thomasjpfan (Member) left a comment
Two nits, otherwise LGTM

sklearn/compose/_target.py (outdated, resolved)
@jeremiedbb (Member)

Please also add a what's new entry

StefanieSenger and others added 2 commits February 23, 2024 11:53
Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
@StefanieSenger (Contributor, Author)

I've applied your suggestions, @thomasjpfan, and added the changelog entry, @jeremiedbb.
Thank you both for reviewing.

@StefanieSenger StefanieSenger changed the title MNT TransformedTargetRegressor.fit() raises if only inverse_func is provided ENH TransformedTargetRegressor.fit() raises if only inverse_func is provided Feb 23, 2024
@thomasjpfan thomasjpfan merged commit 43cfb18 into scikit-learn:main Feb 23, 2024
32 checks passed
@StefanieSenger StefanieSenger deleted the raise_TransformedTargetRegressor branch February 23, 2024 23:22