Add flexible bilinear upsampling aspect ratio redux #1317
Conversation
```diff
-self.size = size
 if scale_factor is not None and not isinstance(scale_factor, (Integral, tuple)):
     raise ValueError('scale_factor must be of integer type or tuple of integer types')
+self.size = _pair(size)
```
```python
# we have to be a tuple at this point
try:
    assert len(self.scale_factor) == 2
    for i in self.scale_factor:
```
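Pulled out of the class, the check excerpted above amounts to something like the following standalone sketch (the function name and the exact error message are mine, not the PR's):

```python
from numbers import Integral

def check_scale_factor_2d(scale_factor):
    # By this point scale_factor has been normalized to a tuple, so verify
    # that it is a 2-tuple whose entries are both integers.
    try:
        assert len(scale_factor) == 2
        for i in scale_factor:
            assert isinstance(i, Integral)
    except (AssertionError, TypeError):
        raise ValueError('scale_factor must be a 2-tuple of integers')
    return scale_factor
```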
```diff
 self.output_size = (
-    input.size(2) * self.scale_factor,
-    input.size(3) * self.scale_factor,
+    input.size(2) * self.scale_factor[0],
+    input.size(3) * self.scale_factor[1],
 )
```
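With a pair of scale factors, the per-axis output-size computation shown above can be sketched outside of torch like this (names are illustrative; `input.size(2)` and `input.size(3)` are the H and W dims of an NCHW tensor):

```python
def bilinear_output_size(input_size, scale_factor):
    # input_size is an (N, C, H, W) shape tuple; scale_factor is (sh, sw).
    # Each spatial dim is scaled by its own factor, so the aspect ratio
    # of the output can differ from the input's.
    n, c, h, w = input_size
    sh, sw = scale_factor
    return (h * sh, w * sw)
```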
```diff
@@ -110,5 +115,21 @@ class UpsamplingBilinear2d(_UpsamplingBase):

     """

+    def __init__(self, size=None, scale_factor=None):
+        super(UpsamplingBilinear2d, self).__init__(size, scale_factor)
```
torch/nn/modules/upsampling.py (outdated)
```diff
-if scale_factor is not None and not isinstance(scale_factor, Integral):
-    raise ValueError('scale_factor must be of integer type')
+if scale_factor is not None and not isinstance(scale_factor, (Integral, tuple)):
+    raise ValueError('scale_factor must be of integer type or tuple of integer types')
 self.size = _pair(size)
```
```python
if self.scale_factor is not None:
    self.scale_factor = _pair(self.scale_factor)
    # we have to be a tuple at this point
```
Thanks for the comments @apaszke! I think I made all the changes you asked for. Please let me know if I didn't understand something (in particular, I just copy/pasted your suggestion):
This allows for the base class to be used for upsampling routines other than 2d. I also renamed _check_bilinear_2d_scale_factor().
thanks Andrew!
…8ffb52 (pytorch#11346)

Summary: Pull Request resolved: pytorch#11346. Previous import was 1b09eb14c2c781fae078fa6b1c0390ba6fc0898c. Included changes:

- **[bff0b88](onnx/onnx@bff0b88)**: Add DynamicSlice experimental op (pytorch#1377) <James Reed>
- **[91a7b8e](onnx/onnx@91a7b8e)**: statCoverage(model) (pytorch#1246) <Akshay Chalana>
- **[36643c6](onnx/onnx@36643c6)**: fix the doc for softmax (pytorch#1374) <Lu Fang>
- **[8c64acd](onnx/onnx@8c64acd)**: Silence unused result warning in ONNXIFI wrapper cleanup. Fix pytorch#1344 (pytorch#1371) <Marat Dukhan>
- **[53b20f6](onnx/onnx@53b20f6)**: Add the ability to deprecate an OpSchema (pytorch#1317) <Ryan Hill>
- **[8aec4e2](onnx/onnx@8aec4e2)**: [Anderspapitto patch] fix the shape inference for broadcasting (pytorch#1368) <Lu Fang>

Reviewed By: jamesr66a. Differential Revision: D9691533. fbshipit-source-id: 1a8c22262ae4946897e4be030d3f1cf3a3ad58b6
… and `fused_weight_gradient_mlp_cuda` is missing (pytorch#1317)
This PR addresses issue #1257, which asks for non-coupled (per-axis) scaling factors for bilinear 2d upsampling. I've added two new tests, and everything currently passes. This is a relatively simple change that only touches Python code.
Furthermore, this is a re-do of PR #1279, where I messed up a rebase. There are some inline comments there that might be worth skimming as well, but I tried to address most of the concerns raised there. In particular, I maintained the common base class of all the upsampling methods.
Tests pass on both GPU and CPU.
tagging @apaszke @fmassa @soumith. Thanks in advance everyone.
ps: I'll probably need a pointer on how to best rebase to master, eventually ;)
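To make the "non-coupled scale factors" concrete, here is a minimal pure-Python sketch of bilinear upsampling with independent height and width factors. It is not the PR's code: the function name is illustrative, and the align_corners-style sampling (matching the old `UpsamplingBilinear2d` semantics) is my assumption.

```python
def upsample_bilinear(img, scale_factor):
    # img is a 2D list of floats; scale_factor is an (sh, sw) pair of ints,
    # so height and width can be scaled independently (aspect ratio changes).
    sh, sw = scale_factor
    h, w = len(img), len(img[0])
    out_h, out_w = h * sh, w * sw
    out = [[0.0] * out_w for _ in range(out_h)]
    for oy in range(out_h):
        for ox in range(out_w):
            # Map output coords back to input coords (align_corners style:
            # the corner pixels of input and output line up exactly).
            sy = oy * (h - 1) / (out_h - 1) if out_h > 1 else 0.0
            sx = ox * (w - 1) / (out_w - 1) if out_w > 1 else 0.0
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Interpolate horizontally on the two rows, then vertically.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[oy][ox] = top * (1 - fy) + bot * fy
    return out
```

With `scale_factor=(2, 3)`, a 2x2 input becomes a 4x6 output, which is exactly the aspect-ratio flexibility the PR adds.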