
[MRG] Avoid unnecessary copies in sklearn.preprocessing #13987

Merged · 3 commits merged into scikit-learn:master on Jun 1, 2019

Conversation

@rth (Member) commented May 30, 2019

Partially addresses #13986

This removes the copy=True in the fit method of StandardScaler, MinMaxScaler, MaxAbsScaler, RobustScaler where it is typically not necessary to compute the scaling factors.
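For illustration, here is a minimal sketch of the idea (the `MinimalScaler` class and exact `check_array` arguments below are made up for this example, not the actual diff): `fit` only reads `X` to compute statistics, so it can validate with `copy=False`, while `transform` keeps honoring `self.copy` because it modifies the data in place.

```python
import numpy as np
from sklearn.utils import check_array


class MinimalScaler:
    """Illustrative only, not the real StandardScaler."""

    def __init__(self, copy=True):
        self.copy = copy

    def fit(self, X, y=None):
        # fit only reads X to compute the scaling factors, so no copy is needed.
        X = check_array(X, copy=False, dtype=np.float64)
        self.mean_ = X.mean(axis=0)
        self.scale_ = X.std(axis=0)
        return self

    def transform(self, X):
        # transform modifies X in place, so the copy (if requested) happens here.
        X = check_array(X, copy=self.copy, dtype=np.float64)
        X -= self.mean_
        X /= self.scale_
        return X
```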

In practice, this makes StandardScaler().fit_transform 10%-20% faster on the few examples I have tried.
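A rough way to reproduce that kind of measurement (the array shape below is arbitrary; the speedup depends on data size, dtype, and hardware):

```python
import numpy as np
from timeit import timeit
from sklearn.preprocessing import StandardScaler

# Moderately sized dense array; timings will vary across machines.
X = np.random.RandomState(0).standard_normal((100_000, 50))
t = timeit(lambda: StandardScaler().fit_transform(X), number=20)
print(f"fit_transform: {t / 20 * 1e3:.1f} ms per call")
```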

If that copy were necessary and this PR mistakenly removed it, check_transformer_general(..., readonly_memmap=True) would fail in the common tests.

@NicolasHug (Member) left a comment

LGTM.

Looks like in these specific cases inplace would have been a more descriptive parameter name than copy, and might have prevented this.

@thomasjpfan (Member) left a comment

LGTM

@thomasjpfan (Member)

Does this need a whats_new entry as an enhancement or a bug fix?

@rth (Member, Author) commented May 31, 2019

Thanks for the reviews! Added a what's new.

@thomasjpfan (Member)

QuantileTransformer has a _check_inputs that copies during both fit and transform. What do you think about adding a copy parameter to _check_inputs and setting it to False during fit and to self.copy during transform?
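A sketch of that suggestion (the class and method bodies below are simplified stand-ins, not the real QuantileTransformer internals): the copy decision moves into the caller of _check_inputs.

```python
import numpy as np
from sklearn.utils import check_array


class MinimalQuantileLike:
    """Illustrative only, not the real QuantileTransformer."""

    def __init__(self, copy=True):
        self.copy = copy

    def _check_inputs(self, X, copy):
        # Shared validation helper; the caller decides whether a copy is needed.
        return check_array(X, copy=copy, dtype=[np.float64, np.float32])

    def fit(self, X, y=None):
        X = self._check_inputs(X, copy=False)  # statistics only: skip the copy
        self.quantiles_ = np.percentile(X, [25, 50, 75], axis=0)
        return self

    def transform(self, X):
        X = self._check_inputs(X, copy=self.copy)  # modified in place: honor self.copy
        X -= self.quantiles_[1]
        return X
```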

@thomasjpfan merged commit 9661a64 into scikit-learn:master on Jun 1, 2019
@thomasjpfan (Member)

Thank you @rth!
