PERF Use np.maximum instead of np.clip in relu function #17609
Conversation
How does this scale with greater
@@ -72,7 +72,7 @@ def relu(X):
     X_new : {array-like, sparse matrix}, shape (n_samples, n_features)
         The transformed data.
     """
-    np.clip(X, 0, np.finfo(X.dtype).max, out=X)
+    np.maximum(X, 0, out=X)
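The upper bound in the old call can never take effect, since no finite value exceeds its own dtype's maximum; only the lower bound at 0 matters, which is exactly an elementwise maximum with 0. A minimal sketch of the equivalence (the array values are made up for illustration):

```python
import numpy as np

X = np.array([-1.5, 0.0, 2.5])

# Old form: clip to [0, dtype max]; the upper bound can never bind.
clipped = np.clip(X, 0, np.finfo(X.dtype).max)

# New form: elementwise maximum with 0, i.e. the ReLU itself.
maxed = np.maximum(X, 0)

assert np.array_equal(clipped, maxed)  # both are [0., 0., 2.5]
```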
I'm surprised this wasn't np.clip(X, 0, out=X)
Thanks @alexhenrie
Thank you!
Neural net training is about 2% faster without checking in relu whether any value is greater than the maximum possible value for the data type (which is always false, so there's no need to check).

Test program:
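The test program itself is not reproduced here; below is a rough sketch of the kind of micro-benchmark that could produce such a comparison (the array shape, dtype, and repeat count are assumptions, not the author's actual setup):

```python
import timeit

setup = "import numpy as np; X = np.random.standard_normal((1000, 1000))"

# Old: clip against the dtype's maximum as the upper bound.
t_clip = timeit.timeit(
    "np.clip(X, 0, np.finfo(X.dtype).max, out=X)", setup=setup, number=100
)

# New: a plain elementwise maximum, no upper-bound check.
t_max = timeit.timeit("np.maximum(X, 0, out=X)", setup=setup, number=100)

print(f"np.clip:    {t_clip:.4f} s")
print(f"np.maximum: {t_max:.4f} s")
```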
Before:
After: