[PyTorch] Fix pad_common for float pad_value #12134

Merged: masahi merged 4 commits into apache:main from shingjan:torch_pad_fix on Aug 12, 2022

Conversation

@shingjan
This PR fixes the implementation of pad_common for a float pad_value, an issue found in models such as pytorch_unet and timm_efficientdet, and adds a couple of new tests for torch.nn.functional.pad.
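The kind of behavior the new tests exercise can be sketched with plain PyTorch: torch.nn.functional.pad accepts a float value argument (including infinities) for constant-mode padding, and a frontend converter must preserve that value rather than truncate it to an integer. This is an illustrative sketch, not the PR's actual test code:

```python
import torch
import torch.nn.functional as F

# a small tensor to pad
x = torch.ones(1, 1, 2, 2)

# pad the last dimension by 1 on each side with a float pad value;
# a converter that assumed integer pad values would lose -inf here
y = F.pad(x, (1, 1), mode="constant", value=float("-inf"))
```

After padding, y has shape (1, 1, 2, 4), with -inf in the padded border positions and the original ones in the interior.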

@shingjan
Author

shingjan commented Aug 9, 2022

@vinx13 @masahi This PR should be ready for another look. Thanks!

@shingjan
Author

shingjan commented Aug 9, 2022

@tvm-bot rerun

@masahi masahi merged commit 22dcf44 into apache:main Aug 12, 2022
@shingjan shingjan deleted the torch_pad_fix branch August 12, 2022 17:38
xinetzone pushed a commit to daobook/tvm that referenced this pull request Nov 25, 2022
* fix pad

* fix constant padding and handle float infinity

* revert change to pad_width

* fix constant pad value
mikeseven pushed a commit to mikeseven/tvm that referenced this pull request Sep 27, 2023
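The commit messages above mention handling float infinity in constant padding. The reference semantics a frontend must reproduce can be sketched with NumPy, whose np.pad likewise takes a float constant value; this is an illustrative sketch of the semantics, not code from the PR:

```python
import numpy as np

# a small float32 array to pad
a = np.ones((2, 2), dtype=np.float32)

# constant padding with an infinite pad value; the result keeps
# the input dtype, and the border cells hold +inf
b = np.pad(a, ((1, 1), (1, 1)), mode="constant", constant_values=np.inf)
```

A converter that coerced the pad value through int() would raise on infinity, which is why the fix keeps the value as a float end to end.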


3 participants