Issue with torch.max() over dim 2 #1310
Comments
Reproduced on
It seems that the error is only in TH. cc @adamlerer
@albanD found that it is not actually limited to 4-D tensors:

```python
import torch

dims = (3, 4, 5, 6, 7, 8)
a = torch.randn(*dims)
for dim in range(len(dims)):
    for i in range(a.size(dim)):
        a.select(dim, i).fill_(i)
    val, argmax = a.max(dim)
    print(argmax.max())
```

gives as output:
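The property the snippet above is probing can be stated as an invariant: every index returned by a max-reduction over a dimension must be strictly less than that dimension's size. A minimal sketch of the same fill-and-reduce pattern, using NumPy as a reference implementation (not the buggy TH code path):

```python
import numpy as np

# Same fill pattern as the PyTorch repro: slice i along `dim` is set to the
# constant i, so the maximum along `dim` always sits at the last index.
dims = (3, 4, 5, 6, 7, 8)
a = np.empty(dims)
for dim in range(len(dims)):
    for i in range(a.shape[dim]):
        idx = [slice(None)] * len(dims)
        idx[dim] = i
        a[tuple(idx)] = i
    argmax = a.argmax(axis=dim)
    print(argmax.max())
    # Invariant: a correct argmax can never reach or exceed the reduced size.
    assert argmax.max() == a.shape[dim] - 1
```

With a correct implementation, each printed value is `dims[dim] - 1`; the bug report shows PyTorch violating this for `dim == 2`.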
It would seem that the problem only happens when the dimension given is 2. The value obtained as the argmax will in that case always be the product of the sizes of the remaining dimensions. Possible interpretation of what happens / probable cause:
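If that observation holds, the out-of-range value for the 6-D example above can be predicted directly. A small sketch of the arithmetic (this reflects the reporter's hypothesis about the bug, not the confirmed root cause):

```python
from math import prod  # Python 3.8+

dims = (3, 4, 5, 6, 7, 8)
dim = 2
# Hypothesis: the bogus argmax equals the product of the sizes of all
# dimensions other than the reduced one.
bogus = prod(s for i, s in enumerate(dims) if i != dim)
print(bogus)  # 3 * 4 * 6 * 7 * 8 = 4032, far beyond the valid maximum dims[2] - 1 == 4
```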
Fixed in master. Thanks for the report.
…60de87 Summary: Previous import was 7848f1e0414ba3b2e263609d93d46fd60790b2e9. Included changes:
- **[6146a85](onnx/onnx@6146a85)**: Check pybind version (pytorch#1315) <Changming Sun>
- **[2cbf740](onnx/onnx@2cbf740)**: Domain exists in GraphProto but not in Node (pytorch#1310) <Ryan Hill>
- **[9b874e9](onnx/onnx@9b874e9)**: [Title] Add optimization pass eliminating nop Pad (pytorch#1307) <Tingfan Wu>

Differential Revision: D9485475
fbshipit-source-id: be00d35f92b00c64b55c9f7798a9c142c70ecd93
When operating over a tensor with 4 dimensions, the indices returned by `torch.max()` are wrong and can go beyond the size of the dimension that was reduced over. Surprisingly, this doesn't happen for other numbers of dimensions.
Code for demonstrating:
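The original demonstration snippet was not preserved in this excerpt. As a hypothetical stand-in, the following sketch shows the kind of check involved, using NumPy (which reduces correctly) to illustrate the invariant the reporter expected `torch.max()` to satisfy:

```python
import numpy as np

# Hypothetical 4-D check (not the reporter's original code): indices from a
# correct max-reduction over `dim` must lie in [0, a.shape[dim]); the
# reported PyTorch bug violated this for dim == 2.
a = np.random.randn(3, 4, 5, 6)
for dim in range(a.ndim):
    argmax = a.argmax(axis=dim)
    assert argmax.max() < a.shape[dim]
```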
Can you confirm this is a bug and not a misunderstanding on my part of what `torch.max()` should return?