Test failure of enum-parallel gradients after PyTorch #5776 #912
fritzo changed the title from "Elbo gradient mismatch with enum-parallel on latest pytorch master" to "Test failure of enum-parallel gradients after PyTorch #5776" on Mar 21, 2018.
Hey @neerajprad, thanks for your report; I'm looking into this now.

I found the bug; I'll send a patch soon.

Thanks, @cpuhrsch! Curious to see where the bug was.
@neerajprad, please see PR pytorch/pytorch#5926.
Fixed upstream by pytorch/pytorch#5926 and in Pyro by #917. |
With the latest PyTorch master, many parallel enumeration tests in `test_enum` are failing due to a mismatch in the gradient computation. To replicate, check out commit `5fa3aac610ee234338dbc11eb5b6d4a133cb483d` on PyTorch master (pytorch/pytorch#5776), build PyTorch, and run these tests. Example of a failing test: `test_elbo_iarange_iarange 2-2-None-None-parallel-None`.

@fritzo, @eb8680 - I thought there could be some unexpected interaction between the dice ELBO change and upstream PyTorch. It turns out that is not exactly the case, since 11 of our tests fail even before the dice ELBO change, but there are more failures (79) with dice ELBO. Could you take a look? This could be either a Pyro bug or something in PyTorch upstream.
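For context, tests of this kind compare an analytically expected gradient against a computed one and fail when they disagree beyond a tolerance. The sketch below is illustrative only, not Pyro's actual test harness: it checks a hand-derived gradient of a toy function against a central finite-difference estimate, which is the same style of comparison that surfaces a gradient mismatch.

```python
# Illustrative gradient check, not Pyro's test_enum code.
# A toy function f(x) = x**2 + 3x with a known analytic gradient
# is compared against a central-difference numerical estimate.

def f(x):
    return x ** 2 + 3 * x

def analytic_grad(x):
    # d/dx (x**2 + 3x) = 2x + 3
    return 2 * x + 3

def numerical_grad(fn, x, eps=1e-6):
    # Central difference approximation of the derivative at x.
    return (fn(x + eps) - fn(x - eps)) / (2 * eps)

def check_grad(x, atol=1e-4):
    # Raise AssertionError on mismatch, like a failing test would.
    expected = analytic_grad(x)
    actual = numerical_grad(f, x)
    assert abs(expected - actual) < atol, (expected, actual)
    return expected, actual

if __name__ == "__main__":
    expected, actual = check_grad(1.5)
    print("gradients agree:", abs(expected - actual) < 1e-4)
```

A regression like the one reported here corresponds to the `actual` side of such a comparison drifting after an upstream change, while the expected values stay fixed.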