
PERF Ravel each array only once in BaseMultilayerPerceptron._backprop #17606

Merged 1 commit into scikit-learn:master on Jun 17, 2020



This change decreases neural net training time by about 1%.
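The kind of change the PR title describes can be sketched as follows. This is a hedged illustration, not the actual `_backprop` diff: the arrays, shapes, and `dot_repeated`/`dot_once` functions below are hypothetical stand-ins, showing only the general pattern of hoisting a repeated `ravel` call so each array is flattened a single time.

```python
import numpy as np

# Hypothetical per-layer gradient arrays; shapes are arbitrary.
rng = np.random.default_rng(0)
grads = [rng.standard_normal((100, 50)) for _ in range(200)]
weights = rng.standard_normal((100, 50))

def dot_repeated(grads, weights):
    # `weights` is re-raveled on every loop iteration.
    return sum(float(g.ravel() @ weights.ravel()) for g in grads)

def dot_once(grads, weights):
    # Hoist the ravel: flatten `weights` once and reuse the result.
    w_flat = weights.ravel()
    return sum(float(g.ravel() @ w_flat) for g in grads)
```

For C-contiguous arrays `ravel` returns a view rather than a copy, so the saving is mostly Python-call overhead, which is consistent with the modest improvement measured below.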

Test program:

from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier
from neurtu import delayed, Benchmark

digits = load_digits(return_X_y=True)
X = digits[0][:,:10]
y = digits[0][:,11]

clf = MLPClassifier(solver='lbfgs', alpha=1e-5,
                    hidden_layer_sizes=(5, 2), random_state=1, max_iter=1000)

train = delayed(clf).fit(X, y)
print(Benchmark(wall_time=True, cpu_time=True, repeat=10)(train))


Before:

      wall_time  cpu_time
mean   2.060248  2.052360
max    2.069880  2.054955
std    0.004758  0.001587


After:

      wall_time  cpu_time
mean   2.042922  2.038958
max    2.045680  2.053701
std    0.002014  0.005943
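Taking the first table as the pre-change run and the second as the post-change run, the mean wall times imply a speedup of just under 1%, in line with the claim above:

```python
# Mean wall_time values copied from the two benchmark tables.
before, after = 2.060248, 2.042922
speedup = (before - after) / before
print(f"{speedup:.2%}")  # prints 0.84%
```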


@rth rth left a comment


Thanks @alexhenrie! LGTM.

@rth rth merged commit f685547 into scikit-learn:master Jun 17, 2020
@alexhenrie (Contributor, Author)

Thank you!

@alexhenrie alexhenrie deleted the ravel branch June 17, 2020 20:30
rubywerman pushed a commit to MLH-Fellowship/scikit-learn that referenced this pull request Jun 24, 2020