
Remove TargetTransformer class #2833

Merged: bchen1116 merged 6 commits into main from bc_2687_targettransformer on Sep 23, 2021
Conversation

bchen1116 (Contributor):

Fixes #2687

bchen1116 self-assigned this on Sep 22, 2021
codecov bot commented on Sep 22, 2021:

Codecov Report

Merging #2833 (ec40190) into main (20a00c8) will not change coverage.
The diff coverage is 100.0%.


@@          Coverage Diff          @@
##            main   #2833   +/-   ##
=====================================
  Coverage   99.8%   99.8%           
=====================================
  Files        297     297           
  Lines      27741   27741           
=====================================
  Hits       27664   27664           
  Misses        77      77           
Impacted Files Coverage Δ
...l/pipelines/components/transformers/transformer.py 100.0% <ø> (ø)
evalml/pipelines/component_graph.py 99.8% <100.0%> (-<0.1%) ⬇️
...ents/transformers/preprocessing/log_transformer.py 100.0% <100.0%> (ø)
...transformers/preprocessing/polynomial_detrender.py 97.7% <100.0%> (+0.2%) ⬆️
evalml/tests/component_tests/test_components.py 99.3% <100.0%> (-<0.1%) ⬇️
...valml/tests/pipeline_tests/test_component_graph.py 99.9% <100.0%> (+0.1%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 20a00c8...ec40190.

angela97lin (Contributor) left a comment:
Love these cleanup PRs, keep them coming 😂 ! LGTM, thank you! 👍

@@ -1631,13 +1628,13 @@ def test_component_modifies_feature_or_target():
     for component_class in all_components():
         if (
             issubclass(component_class, BaseSampler)
-            or issubclass(component_class, TargetTransformer)
+            or hasattr(component_class, "inverse_transform")
Contributor commented:
Open question: Do you think this will hold true long term? It's true for our components now but... 👀

bchen1116 (Contributor, Author) commented on Sep 22, 2021:
Hmm, I do think that if we have an inverse_transform method, then it'll modify the target rather than the features. It doesn't seem to make sense for a transformer to have an inverse_transform method for the features. I also don't think it'd make sense to introduce an inverse_transform method if we don't alter the target at all. Logically, it seems like this should still hold long term.

freddyaboulton (Contributor) commented on Sep 23, 2021:

Should we add inverse_transform as a method to Transformer? The "default" behavior is a no-op, and if we want to enforce that transformers that modify the target override that behavior, we can do something like:

def inverse_transform(self, y):
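    # No-op by default; a transformer that modifies the target must
    # override this, otherwise the inherited version raises below.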
    if self.modifies_target:
        raise NotImplementedError("Transformers that modify the target must override inverse_transform")
    return y

That way this behavior always holds. Or we don't make this change? Maybe I was being crazy when I filed the issue lol
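To make the suggestion concrete, here is a minimal, self-contained sketch of how that default could interact with a target-modifying transformer. Only the names Transformer, LogTransformer, modifies_target, and inverse_transform come from this thread; the class bodies and the NumPy-based transform logic are illustrative, not evalml's actual implementation.

import numpy as np


class Transformer:
    # Feature-only transformers keep the default and never hit the error.
    modifies_target = False

    def inverse_transform(self, y):
        # No-op default; target-modifying subclasses must override.
        if self.modifies_target:
            raise NotImplementedError(
                "Transformers that modify the target must override inverse_transform"
            )
        return y


class LogTransformer(Transformer):
    # Declares that it modifies the target, so it supplies a real inverse.
    modifies_target = True

    def transform(self, y):
        return np.log(y)

    def inverse_transform(self, y):
        return np.exp(y)

Note that with this in place, hasattr(component_class, "inverse_transform") would be true for every transformer, so the test above would need to key off modifies_target rather than the hasattr check.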

Contributor commented:
I think Freddy's suggestion is a good one!

chukarsten (Contributor) left a comment:
LGTM!

bchen1116 merged commit 5e62757 into main on Sep 23, 2021
chukarsten mentioned this pull request on Oct 1, 2021
freddyaboulton deleted the bc_2687_targettransformer branch on May 13, 2022
Development

Successfully merging this pull request may close these issues:

Delete TargetTransformer class (#2687)

4 participants