
[MRG] Add a random seed for generating X in test_mlp.test_gradient() #13585

Conversation

@aditya1702 (Contributor) commented Apr 6, 2019

Reference Issues/PRs

Fixes #13581

What does this implement/fix? Explain your changes.

Add a random seed so that the random values generated for X are fixed. This ensures that the test does not fail intermittently on CI.

n_features = 10
np.random.seed(42)
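For illustration only (not part of the PR), a minimal sketch of why a fixed seed makes the test deterministic: reseeding the global generator reproduces exactly the same X on every run, so CI no longer sees occasional unlucky draws.

import numpy as np

n_samples, n_features = 5, 10

np.random.seed(42)
a = np.random.random((n_samples, n_features))
np.random.seed(42)
b = np.random.random((n_samples, n_features))

# Identical draws: the test data no longer varies between runs.
assert np.array_equal(a, b)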
@TomDLT (Member) commented:
This is usually not the way we do it.
To avoid changing the global random state, we prefer using:

rng = np.random.RandomState(42)
X = rng.random((n_samples, n_features))
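For illustration, a minimal sketch of the difference (random_sample is the long-standing spelling of the same method; rng.random was added later as an alias):

import numpy as np

# Seeding the global generator mutates state shared by every np.random call
# in the process, so unrelated tests that draw from np.random are affected:
np.random.seed(42)
X_global = np.random.random((5, 10))

# A local RandomState owns its own stream; nothing outside the test changes:
rng = np.random.RandomState(42)
X_local = rng.random_sample((5, 10))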

@aditya1702 (Author) replied:

Ah! Makes sense. Will make the changes ✌️

@aditya1702 (Author) commented:

@TomDLT Made the changes

@TomDLT (Member) left a comment:

LGTM, thanks @aditya1702

@@ -175,9 +175,10 @@ def test_gradient():
     # are correct. The numerical and analytical computation of the gradient
     # should be close.
     for n_labels in [2, 3]:
-        n_samples = 5
+        n_samples = 100
A Member commented:
I don't see how this helps. If it worked with 5 and that tests the functionality we seek, why change it?

@NicolasHug (Member) replied:

Sorry, my bad. I suggested it to make the test more robust now that the random seed is fixed, but I was misled into thinking that the gradient was taken w.r.t. the samples (too much gradient boosting, I guess).

numgrad and grad already have size 121 or 143, so this is not needed. Can you please revert, @aditya1702?

LGTM otherwise
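For context, the length of the gradient vector equals the number of model parameters, not the number of samples. A back-of-the-envelope sketch, assuming n_features = 10 as in the test and a single hidden layer of 10 units (the hidden-layer size is an assumption here):

n_features, n_hidden = 10, 10  # n_hidden assumed for illustration

for n_labels in [2, 3]:
    # the binary case uses a single logistic output unit
    n_outputs = 1 if n_labels == 2 else n_labels
    n_params = (n_features * n_hidden + n_hidden      # hidden weights + biases
                + n_hidden * n_outputs + n_outputs)   # output weights + biases
    print(n_labels, n_params)  # -> 2 121, 3 143

So raising n_samples enlarges nothing in the vectors being compared; it only changes the data the gradient is evaluated on.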

@aditya1702 (Author) replied:

@NicolasHug Reverted

@jnothman (Member) commented Apr 6, 2019 via email.

@NicolasHug merged commit 3ce6237 into scikit-learn:master on Apr 7, 2019.
@NicolasHug:

Thanks @aditya1702 !

@aditya1702 deleted the fix-random-seed-in-test-mlp-gradient branch on April 9, 2019.
jeremiedbb pushed a commit to jeremiedbb/scikit-learn that referenced this pull request Apr 25, 2019
xhluca pushed a commit to xhluca/scikit-learn that referenced this pull request Apr 28, 2019
koenvandevelde pushed a commit to koenvandevelde/scikit-learn that referenced this pull request Jul 12, 2019
Merging this pull request closed the issue: Use fixed random seed in test_mlp.test_gradient (#13581)