Bug: torch.distributions.mixture_same_distribution._pad_mixture_dimension #73792
Would you extend this issue with a small code snippet that demonstrates the bug and shows the before/after with the proposed fix?

Yep, just open a pull request for your branch to be merged with master and we'll be able to review it.
Example given: Under the current version, the below snippet throws "RuntimeError: only tensors with up to 64 dims are supported", because the buggy code tries to create a tensor with on the order of a thousand dimensions. Under the fix it correctly prints the mean of the Gaussian mixture (100 zeros).
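The snippet itself did not survive extraction; the following is a hypothetical reconstruction (all shapes and values are assumed, not taken from the original report). It builds a mixture whose batch shape and mixture-distribution batch shape have different numbers of dimensions, which is the case that exercises the padding logic:

```python
import torch
import torch.distributions as D

# Hypothetical repro (shapes assumed): a mixture of 1000 Gaussians
# with batch shape (100,). The scalar Categorical has batch_shape (),
# while the resulting mixture has batch_shape (100,).
mix = D.Categorical(torch.ones(1000))    # batch_shape: ()
comp = D.Normal(torch.zeros(100, 1000),  # batch_shape: (100, 1000)
                torch.ones(100, 1000))
gmm = D.MixtureSameFamily(mix, comp)     # batch_shape: (100,)

# With the bug, the padding helper counts elements instead of
# dimensions and tries to build a tensor with far more than 64 dims,
# raising a RuntimeError. With the fix, this prints 100 zeros.
print(gmm.mean)
```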
Fixes Issue #73792. This is a duplicate of pull request #73864. It's a small bugfix that should have happened a long time ago, but it didn't because I didn't actually follow up with the pull request after originally submitting it. That's my bad; trying to remedy the error. This contains a fix to _pad_mixture_dimension, which intends to count the number of dimensions of its referent tensors but accidentally counts the number of elements (and can thus end up creating tensors with potentially thousands of dimensions by mistake). Also contains a single test for the fixed behavior.

Co-authored-by: Jeffrey Wan <soulitzer@gmail.com>
Pull Request resolved: #118947
Approved by: https://github.com/soulitzer
🐛 Describe the bug
I already tracked down what's going on with this one; there is a bug in _pad_mixture_dimension in torch.distributions.mixture_same_family. It calls numel() on Size objects to get the number of dimensions of its tensors. However, Size.numel() multiplies all the entries of the shape together (it tells you how many elements are in a tensor of that shape!), and so returns what is clearly the wrong value.
The current code (which is in both 1.8, which I was on originally, and master) calls numel() on the batch shapes where it should instead take len() of them, i.e. count the dimensions rather than the elements.
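A minimal stand-in illustrating the distinction (plain tuples stand in for torch.Size, which is a tuple subclass; the variable names are illustrative, not from the PyTorch source):

```python
from math import prod

# Stand-in for torch.Size([100, 10]); torch.Size subclasses tuple,
# so len() and iteration behave the same way on both.
batch_shape = (100, 10)

# Size.numel() multiplies the entries together: the element count of
# a tensor with this shape -- what the buggy code computes.
numel_like = prod(batch_shape)   # 1000

# len() counts the entries: the number of dimensions -- what the
# padding helper actually needs.
ndims = len(batch_shape)         # 2
```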
I didn't see this error anywhere else, but I haven't really done a thorough search.
I don't know what branch it's right to make a pull request on (master?) so I thought I'd report it here first.
Versions
Versions of relevant libraries:
[pip3] numpy==1.20.3
[pip3] torch==1.8.0
[pip3] torchvision==0.9.0
[conda] blas 2.109 mkl conda-forge
[conda] blas-devel 3.9.0 9_mkl conda-forge
[conda] cudatoolkit 10.2.89 h8f6ccaa_8 conda-forge
[conda] ffmpeg 4.3 hf484d3e_0 pytorch
[conda] libblas 3.9.0 9_mkl conda-forge
[conda] libcblas 3.9.0 9_mkl conda-forge
[conda] liblapack 3.9.0 9_mkl conda-forge
[conda] liblapacke 3.9.0 9_mkl conda-forge
[conda] mkl 2021.2.0 h726a3e6_389 conda-forge
[conda] mkl-devel 2021.2.0 ha770c72_390 conda-forge
[conda] mkl-include 2021.2.0 h726a3e6_389 conda-forge
[conda] numpy 1.20.3 py38h9894fe3_1 conda-forge
[conda] pytorch 1.8.0 py3.8_cuda10.2_cudnn7.6.5_0 pytorch
[conda] torchvision 0.9.0 py38_cu102 pytorch
cc @ezyang @gchanan @zou3519 @fritzo @neerajprad @alicanb @nikitaved