Add torch.jit.script support for warp_affine #2588
Conversation
btw, it's pretty awesome that ruff is able to revert the typing from | to Union and Optional
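As an illustration of the rewrite being discussed (not kornia code; `warp` and `scale` are made-up names), the switch from PEP 604 `X | None` syntax to `Optional[X]` is purely syntactic, and the `typing` module treats the two spellings as the same type:

```python
from typing import Optional, Union

# Before the rewrite (PEP 604 syntax, which torch.jit.script does not accept):
#     def warp(x: Tensor, M: Tensor | None = None) -> Tensor: ...
# After the rewrite (what this PR switches to):
#     def warp(x: Tensor, M: Optional[Tensor] = None) -> Tensor: ...

# Both spellings denote the same type at runtime:
assert Optional[int] == Union[int, None]


def scale(x: float, factor: Optional[float] = None) -> float:
    # An explicit None check, which TorchScript also requires
    # to narrow an Optional before use.
    if factor is None:
        return x
    return x * factor
```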
@@ -11,6 +11,16 @@
from kornia.utils.helpers import _torch_inverse_cast

class DummyNNModule(torch.nn.Module):
@johnnv1 i think as a post-meta task from this we could explore prototyping an AutoModule that, given a function parsed via inspect, automatically generates Modules. This might require creating InputTensor / OutputTensor alias types.
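A minimal sketch of the idea being proposed, assuming an `auto_module` factory that inspects a function's signature (in kornia the generated class would subclass `torch.nn.Module`; a plain class is used here so the sketch stays dependency-free, and `scale_points` is a hypothetical stand-in for a functional op):

```python
import inspect


def auto_module(fn):
    """Hypothetical AutoModule factory: wrap a function in a module-like class.

    Parameters after the first (tensor-like) argument become constructor
    keyword arguments; the first argument is passed at call time.
    """
    sig = inspect.signature(fn)
    ctor_params = list(sig.parameters)[1:]

    class AutoModule:
        def __init__(self, **kwargs):
            unknown = set(kwargs) - set(ctor_params)
            if unknown:
                raise TypeError(f"unexpected arguments: {unknown}")
            self._kwargs = kwargs

        def forward(self, x):
            return fn(x, **self._kwargs)

        __call__ = forward

    # Derive a class name from the wrapped function, e.g. scale_points -> ScalePoints
    AutoModule.__name__ = fn.__name__.title().replace("_", "")
    return AutoModule


def scale_points(points, factor=2.0):
    # Stand-in for a functional op such as warp_affine
    return [p * factor for p in points]


ScalePoints = auto_module(scale_points)
mod = ScalePoints(factor=3.0)
result = mod([1.0, 2.0])  # -> [3.0, 6.0]
```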
we can work on it
@@ -268,6 +278,35 @@ def test_fill_padding_channels(self, device, dtype, num_channels):

assert_close(img_a[:, :, :1, :1].squeeze(), fill_value.squeeze())

@pytest.mark.parametrize("align_corners", (True, False))
@pytest.mark.parametrize("padding_mode", ("zeros", "fill"))
def test_jit_script(self, device, dtype, align_corners, padding_mode):
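The pattern the test above exercises, wrapping a functional op in an `nn.Module` so `torch.jit.script` can compile it, can be sketched as follows. This is an illustrative stand-in, not kornia's actual `warp_affine` test; `scale_shift` and `ScaleShiftModule` are made-up names:

```python
from typing import Optional

import torch


def scale_shift(x: torch.Tensor, scale: float, shift: Optional[torch.Tensor] = None) -> torch.Tensor:
    # TorchScript needs Optional[...] (not "Tensor | None") and an
    # explicit None check before the optional argument is used.
    if shift is None:
        return x * scale
    return x * scale + shift


class ScaleShiftModule(torch.nn.Module):
    """Thin nn.Module wrapper so the functional op can be scripted."""

    def __init__(self, scale: float) -> None:
        super().__init__()
        self.scale = scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return scale_shift(x, self.scale)


# torch.jit.script compiles the module and, recursively, the free function it calls.
scripted = torch.jit.script(ScaleShiftModule(2.0))
out = scripted(torch.ones(2, 2))  # -> tensor filled with 2.0
```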
is it relevant to keep the torchscript test if we go for onnx? under the hood it traces in any case
I consider onnx to be one of the end states of the torch scripting pipeline (tensorrt is another widely used format that comes to mind), so I think it's worth keeping the test for the intermediate torch.jit.script product.
Again, that's coming from me as a user of kornia rather than a person who has insight into the project.
sure, it's just that we have had such a bad experience with torchscript over the years and recently prioritised onnx, as it seems to be the most requested format, but we're always open to community needs.
Co-authored-by: Edgar Riba <edgar.riba@gmail.com>
Let me know if there is anything else needed to get this merged. I believe all of the remaining threads are outstanding items for the project in general rather than for this PR (of course, if one of them should be included here, ping me). I appreciate the fast responsiveness and the great feedback! It made me feel welcome to the project!
* Switch "X | None" to Optional[X]
* Add tests for scripting/compiling warp_affine
* add temp.onnx to gitignore
* add asserts to test_jit_script
* remove onnx test
* Update .gitignore with *.onnx

Co-authored-by: Edgar Riba <edgar.riba@gmail.com>
Changes

* Changes ruff to allow usage of Optional instead of | None
* Add temp.onnx to .gitignore (since some tests output that)
* Add torch.jit.script test for warp_affine
* Add torch.onnx.export test for warp_affine (currently failing due to op not supported)
* Some other changes that were forced due to use of pre-commit. Please let me know if these should not be included, and in such a case whether it is ok to not run pre-commit.

Fixes #2583
Type of change
Checklist