
Fix error in Binomial to retain lazy logit initialization #46055

Closed
wants to merge 2 commits

Conversation

neerajprad (Contributor)

Some internal tests were sporadically failing for #45648. The cause is a bug in Binomial.__init__ that references the lazy logits attribute and sets it when not needed. This also cleans up the is_scalar logic, which isn't needed given that broadcast_all will convert a Number to a tensor.
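The bug pattern described above can be sketched without torch. The following is a minimal, hypothetical illustration (not the actual torch source): a distribution with two parameterizations, probs and logits, should store only the one it was given, and derive the other lazily on first access. Referencing the lazy attribute inside __init__ would eagerly trigger and cache the conversion, defeating the laziness.

```python
import math

class lazy_property:
    """Minimal stand-in for torch.distributions.utils.lazy_property:
    computes the value on first access, then caches it on the instance
    (a non-data descriptor, so the cached instance attribute wins later)."""
    def __init__(self, fget):
        self.fget = fget
        self.name = fget.__name__

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.fget(obj)
        obj.__dict__[self.name] = value  # cache on the instance
        return value

class Binomial:
    """Toy scalar Binomial, for illustration only."""
    def __init__(self, total_count=1, probs=None, logits=None):
        if (probs is None) == (logits is None):
            raise ValueError("Either `probs` or `logits` must be specified, but not both.")
        self.total_count = total_count
        # Fix: assign only the parameterization that was given.
        # Touching `self.logits` here would eagerly run the lazy conversion.
        if probs is not None:
            self.probs = probs
        else:
            self.logits = logits

    @lazy_property
    def logits(self):
        # Derived lazily from probs on first access.
        p = self.probs
        return math.log(p) - math.log(1 - p)

    @lazy_property
    def probs(self):
        # Derived lazily from logits on first access (sigmoid).
        return 1.0 / (1.0 + math.exp(-self.logits))
```

With this shape, `Binomial(probs=0.5)` has no `logits` entry in its instance dict until `logits` is actually read, which is the behavior the fix restores.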

The reason for the flakiness is the mutation of the params dict by the first test, which is fixed by doing a shallow copy. It would be better to convert this into a pytest parameterized test once #45648 is merged.
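The flakiness mechanism can be shown with a small, hypothetical sketch (the names below are illustrative, not from the actual test suite): when tests share one params dict, a test that mutates it in place leaks state into every later test, while shallow-copying before mutation keeps the shared dict intact.

```python
# Shared fixture data, reused across several tests (hypothetical example).
BASE_PARAMS = {"total_count": 10, "probs": 0.5}

def flaky_test(params):
    # Bug pattern: mutates the shared dict in place, so later tests
    # silently see probs=0.9 instead of the intended 0.5.
    params["probs"] = 0.9
    return params

def fixed_test(params):
    # Fix: shallow-copy first, mutate only the local copy.
    local = dict(params)
    local["probs"] = 0.9
    return local
```

A shallow copy suffices here because the test only rebinds top-level keys; if the values were themselves mutated in place, a deep copy would be needed instead.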

cc. @fritzo, @ezyang

@neerajprad neerajprad requested a review from fritzo October 8, 2020 20:55
@neerajprad neerajprad added the module: distributions Related to torch.distributions label Oct 8, 2020
codecov bot commented Oct 9, 2020

Codecov Report

Merging #46055 into master will decrease coverage by 0.01%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master   #46055      +/-   ##
==========================================
- Coverage   68.26%   68.26%   -0.01%     
==========================================
  Files         410      410              
  Lines       53328    53323       -5     
==========================================
- Hits        36403    36399       -4     
+ Misses      16925    16924       -1     
Impacted Files                     Coverage Δ
torch/distributions/binomial.py    95.94% <100.00%> (+1.00%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 5f7545a...f792482.

fritzo (Collaborator) commented Oct 9, 2020

@pytorchbot merge this please

@pytorchbot pytorchbot added the merge-this-please Was marked for merge with @pytorchbot merge this please label Oct 9, 2020
facebook-github-bot (Contributor) left a comment

@neerajprad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor) commented

@neerajprad merged this pull request in 9202c44.

Labels: merge-this-please (Was marked for merge with @pytorchbot merge this please), Merged, module: distributions (Related to torch.distributions)
4 participants