[MRG] Add a random seed for generating X in test_mlp.test_gradient() #13585
Conversation
     n_features = 10
+    np.random.seed(42)
This is usually not the way we do it.
To avoid changing the global random state, we prefer using:
rng = np.random.RandomState(42)
X = rng.random((n_samples, n_features))
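For context, a minimal sketch (not part of the review) contrasting the two approaches; the shapes here are illustrative only:

import numpy as np

# Global seeding: reseeds the process-wide np.random state, so every
# later np.random call in the test session is affected.
np.random.seed(42)
X_global = np.random.rand(5, 10)

# Preferred: a local RandomState keeps the determinism contained.
rng = np.random.RandomState(42)
X_local = rng.rand(5, 10)  # identical values, no global side effect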
Ah! Makes sense. Will make the changes ✌️
@TomDLT Made the changes
LGTM, thanks @aditya1702
@@ -175,9 +175,10 @@ def test_gradient():
     # are correct. The numerical and analytical computation of the gradient
     # should be close.
     for n_labels in [2, 3]:
-        n_samples = 5
+        n_samples = 100
I don't see how this helps. If it worked with 5 and that tests the functionality we seek, why change it?
Sorry, my bad. I suggested it to make the test more robust now that the random seed is fixed, but I was misled into thinking that the gradient was taken w.r.t. the samples (too much gradient boosting, I guess).
The numgrad and grad size is already 121 or 143 (see the sketch below), so this is not needed. Can you please revert, @aditya1702?
LGTM otherwise
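A hedged aside on where 121 and 143 come from: assuming the test uses n_features = 10 and a single hidden layer of 10 units (the hidden-layer size is an assumption, not quoted from the test), the gradient is taken w.r.t. the packed weights and biases, so its size is the total parameter count and does not depend on n_samples:

# Parameter count per label setting (2 labels -> 1 output unit, 3 labels -> 3).
n_features, n_hidden = 10, 10  # n_hidden = 10 is assumed here
for n_outputs in (1, 3):
    n_params = (n_features * n_hidden + n_hidden      # input->hidden weights + biases
                + n_hidden * n_outputs + n_outputs)   # hidden->output weights + biases
    print(n_outputs, n_params)  # prints 1 121, then 3 143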
@NicolasHug Reverted
@NicolasHug, the green button is all yours to push!
Thanks @aditya1702!
Revert "[MRG] Add a random seed for generating X in test_mlp.test_gradient() (scikit-learn#13585)". This reverts commit d2e3ff6.
Reference Issues/PRs
Fixes #13581
What does this implement/fix? Explain your changes.
Add a random seed to fix the random values generated for X. This ensures that the test does not fail intermittently on CI.
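A minimal sketch of the resulting setup (hedged: the actual test body lives in sklearn/neural_network/tests/test_mlp.py and contains more than this):

import numpy as np

n_samples, n_features = 5, 10
rng = np.random.RandomState(42)      # local seeded RNG, no global side effects
X = rng.rand(n_samples, n_features)  # deterministic X on every CI run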