Bugfix for doing negative padding #161639
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/161639
Note: Links to docs will display an error until the docs builds have been completed. ✅ No Failures as of commit 74a3823 with merge base 264e7f6. This comment was automatically generated by Dr. CI and updates every 15 minutes.
@pytorchbot label "topic: bug fixes"
Hi @manuelcandales
```diff
 auto pad_idx = pad.size() - ((i + 1) * 2);
 auto new_dim = input_sizes[l_diff + i] + pad[pad_idx] + pad[pad_idx + 1];
-TORCH_CHECK(new_dim > 0, "The input size ", input_sizes[l_diff + i], ", plus negative padding ",
+TORCH_CHECK(new_dim >= 0, "The input size ", input_sizes[l_diff + i], ", plus negative padding ",
```
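For illustration, here is a hypothetical Python mirror of that shape computation (the function name `padded_shape` and the `allow_zero` flag are inventions for this sketch, not PyTorch API). It shows why the old `> 0` check rejected a dimension padded down to exactly zero while the fixed `>= 0` check accepts it:

```python
def padded_shape(input_sizes, pad, allow_zero=True):
    """Sketch of constant_pad_nd's output-shape check.

    `pad` is ordered last-dimension-first in (before, after) pairs,
    matching torch.nn.functional.pad's convention.
    """
    l_diff = len(input_sizes) - len(pad) // 2
    out = list(input_sizes[:l_diff])  # leading dims are untouched
    for i in range(len(pad) // 2):
        pad_idx = len(pad) - (i + 1) * 2
        new_dim = input_sizes[l_diff + i] + pad[pad_idx] + pad[pad_idx + 1]
        # Old check (allow_zero=False): new_dim > 0 rejected a dimension
        # padded down to exactly zero. Fixed check: new_dim >= 0 lets the
        # zero-sized dimension through, so an empty tensor can be returned.
        if new_dim < (0 if allow_zero else 1):
            raise ValueError(f"invalid resulting dimension {new_dim}")
        out.append(new_dim)
    return out
```

For example, `padded_shape([3, 4], [-2, -2])` trims the last axis from 4 down to 0: with the fix it returns `[3, 0]`, whereas the old strict check raised.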
Nit: this should probably be TORCH_CHECK_VIEW
I don't see a TORCH_CHECK_VIEW in the repo... Could you point me to it so that I can see it and make the appropriate changes?
@manuelcandales please shepherd this to land, thanks!
Hi, just checking in on this PR. Any help landing it would be much appreciated.
@skpark-rh can you fix the failing test? The failing test was marked as xfail, and with your fix we can remove that marker.
@isuruf Okay, I can take a look at the failing test. The new test case,
@skpark-rh the new test you added has nothing to do with it. The test
@isuruf Okay, that makes sense. I removed the xfail.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Fixes #161014
This bug fix makes the shape check consistent with the rest of the exception handling. As outlined in issue #161014, there is an edge case where negative padding does not make the tensor size negative but still triggers the exception claiming that the size is negative. The fix is simply relaxing the check to `new_dim >= 0`, so the zero dim is allowed and the operator returns an empty tensor.

In the PR I have added an edge-case test in which negative padding reduces a dimension to zero. The sample covers only the `constant` type of padding; I would like some feedback on whether the same sample is needed for the `reduce` type as well.

This is my first PR contributing to PyTorch, and any help/feedback will be welcome. Thank you!
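As a toy analogy (plain Python, not PyTorch code): negative padding is effectively narrowing, and narrowing an axis to exactly zero elements is well defined, just as a zero-length slice is:

```python
# Trimming a length-2 row by one element on each side, the way padding
# with pad=(-1, -1) would: the result is empty, not an error.
row = [10, 20]
left, right = -1, -1                    # negative padding amounts
trimmed = row[-left : len(row) + right]  # row[1:1]
print(trimmed)  # []
```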
@malfet @manuelcandales @janeyx99 @ezyang
cc @H-Huang @awgu @wanchaol @fegin @fduwjj @wz337 @wconstab @d4l3k @pragupta @ezyang @msaroufim @dcci @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben @Lucaskabela