Add correction parameter to std/var #50903
Conversation
[ghstack-poisoned]
💊 CI failures summary and remediations. As of commit 94a0fa4 (more details on the Dr. CI page):
🕵️ 3 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages:
First part of #50010 [ghstack-poisoned]
Looks like a complex test is failing internally:
Maybe the test is sensitive to different random values, a different version of NumPy, or a different compiler? We might need to make the test "easier" or simplify it. The complex double version of the test is failing with the same error.
First part of #50010. Also fixes #51127. Differential Revision: [D27911345](https://our.internmc.facebook.com/intern/diff/D27911345) [ghstack-poisoned]
@mruberry the test is sensitive to machine precision, but only in the event the variance is zero (since exactly zero gives nan, while finite but small gives inf). However, zero variance seems very unlikely on random data, so that's still a bit surprising and might be worth looking into. For now, I've changed the test to treat nan and inf as equal.
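For illustration, a minimal sketch of the nan-vs-inf distinction described above (an assumed simplification, not the actual failing test):

```python
import torch

# When the variance is exactly zero, both numerator and denominator vanish,
# giving 0/0 = nan; a finite numerator over a zero denominator gives inf.
num = torch.tensor([0.0, 1.0])
den = torch.tensor([0.0, 0.0])
print(num / den)  # tensor([nan, inf])
```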
Unfortunately the same tests are failing:
If it's a pain to sort out what's going on, I could try to get someone with access to the internal test harness to take a look?
@mruberry I've loosened the restrictions on the imaginary component this time, allowing it to be a non-finite value if the real component also isn't finite. If that doesn't work, I really need to know what arguments to
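Roughly, the loosened check amounts to something like the following (a hypothetical sketch, not the actual test code; `imag_component_ok` is an illustrative name):

```python
import math

def imag_component_ok(actual: complex, expected: complex, rel_tol: float = 1e-6) -> bool:
    # A non-finite imaginary part is only accepted when the real part is
    # also non-finite; otherwise the imaginary parts must still match.
    if not math.isfinite(actual.imag):
        return not math.isfinite(actual.real)
    return math.isclose(actual.imag, expected.imag, rel_tol=rel_tol)
```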
Sounds good; let's see what the internal tests say.
Good news! Internal tests are now passing. I'll get the land process started tomorrow morning and ping @JackCaoG with updates.
Darn, looks like the PR CI failures for fx are real: test_normalize_operator_exhaustive_std_cpu_float32. Probably just a tweak to that test is needed?
@mruberry since the fx PR got reverted, I've removed it from this stack and instead just added a skip in the test for
ghstack-source-id: 49aef26 Pull Request resolved: pytorch#50903
We'll have to try again on Monday, sorry @peterbell10. Some landing issues meant this accumulated another merge conflict. Let's rebase it Monday morning and I'll start the land process while no one's committing.
@mruberry I've rebased on viable/strict and fixed merge conflicts.
Internal tests look good; unfortunately I lost power this morning, but I'll try to coordinate this with @JackCaoG tomorrow morning.
Summary:
Pull Request resolved: pytorch#50903

First part of pytorch#50010. Also fixes pytorch#51127.

Test Plan: Imported from OSS

Reviewed By: ngimel

Differential Revision: D27911345

Pulled By: mruberry

fbshipit-source-id: 7138fddc935802918ab9ff19f4bc1b9f4d745d41
ghstack-source-id: 98ac0e1 Pull Request resolved: pytorch/pytorch#50903
Stack from ghstack:
- #55679 std/var: Deprecate default `correction` value and `unbiased` argument

First part of #50010. Also fixes #51127.

Added overloads for torch.{std, var, std_mean, var_mean} with a `correction` argument specifying the difference between the sample size and the number of degrees of freedom. For example, `correction=1` is equivalent to Bessel's correction, which can also be obtained using the `unbiased=True` overload.

Differential Revision: [D27911345](https://our.internmc.facebook.com/intern/diff/D27911345)
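As a quick illustration of the new argument (a minimal sketch based on the description above; the exact final signature may differ):

```python
import torch

x = torch.randn(100)

# correction is the difference between the sample size N and the divisor's
# degrees of freedom, i.e. the divisor is N - correction.
v_corrected = torch.var(x, dim=None, correction=1)   # Bessel's correction
v_population = torch.var(x, dim=None, correction=0)  # divide by N

# Equivalent spellings through the pre-existing boolean flag:
assert torch.allclose(v_corrected, torch.var(x, unbiased=True))
assert torch.allclose(v_population, torch.var(x, unbiased=False))
```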