Add type annotations for a few torch.nn.modules #46013
Conversation
torch/nn/modules/padding.py (Outdated)

```diff
@@ -18,7 +18,7 @@ def __init__(self, value: float) -> None:
         self.value = value

     def forward(self, input: Tensor) -> Tensor:
-        return F.pad(input, self.padding, 'constant', self.value)
+        return F.pad(input, self.padding, 'constant', self.value)  # type: ignore[arg-type]
```
Why is the `ignore` necessary here?
mypy infers the type of `self.padding` as `Union[Tensor, Module]`, but `F.pad` expects `padding` to be a `Sequence[int]`:

```
torch/nn/modules/padding.py:21: error: Argument 2 to "pad" has incompatible type "Union[Tensor, Module]"; expected "Sequence[int]"  [arg-type]
```
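For context, a minimal sketch of where that union comes from (the `PadSketch` name is hypothetical; it mirrors how `_ConstantPadNd` leaves `padding` to be set by its subclasses): `nn.Module.__getattr__` is annotated to return `Union[Tensor, Module]`, so any attribute mypy never sees assigned on the class resolves to that union.

```python
from torch import Tensor
from torch.nn import Module
import torch.nn.functional as F

class PadSketch(Module):  # hypothetical stand-in for _ConstantPadNd
    def __init__(self, value: float) -> None:
        super().__init__()
        self.value = value  # `padding` is assigned by subclasses, not here

    def forward(self, input: Tensor) -> Tensor:
        # mypy resolves `self.padding` through Module.__getattr__, whose
        # declared return type is Union[Tensor, Module] -- hence the error.
        return F.pad(input, self.padding, 'constant', self.value)
```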
Replacing the `# type: ignore[...]` with an assertion also crashes PyTorch. I think I will defer annotating this module for now, since it requires more work.
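The exact assertion is not shown in the thread, but a narrowing variant would look roughly like this drop-in replacement for `PadSketch.forward` above (again a sketch, not the PR's code):

```python
    def forward(self, input: Tensor) -> Tensor:
        padding = self.padding
        # Narrow mypy's Union[Tensor, Module] before calling F.pad. Per the
        # comment above, an assertion along these lines still crashed PyTorch
        # (TorchScript compiles forward() as well, not just mypy).
        assert isinstance(padding, (tuple, list))
        return F.pad(input, padding, 'constant', self.value)
```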
It looks like some TorchScript tests are failing. TorchScript makes use of the type annotation information.
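To make the coupling concrete (a minimal sketch, not code from this PR): `torch.jit.script` consumes the same Python annotations at compile time, so an annotation that satisfies mypy can still change or break TorchScript compilation.

```python
import torch

@torch.jit.script
def scale(x: torch.Tensor, factor: float) -> torch.Tensor:
    # TorchScript reads the annotations above while compiling this function;
    # an annotation that contradicts how a value is used is a script error.
    return x * factor
```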
💊 CI failures summary and remediations

As of commit 1d07a2a (more details on the Dr. CI page):

🕵️ 4 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages:
- binary_windows_libtorch_3_7_cpu_debug_build (1/4), step "Build"
Hi @zou3519, thanks for the review. I've marked the PR as a draft and will work/experiment a bit more.
Codecov Report

```diff
@@            Coverage Diff             @@
##           master   #46013       +/-   ##
===========================================
+ Coverage   35.26%   68.92%   +33.65%
===========================================
  Files         443      434        -9
  Lines       56588    56063      -525
===========================================
+ Hits        19958    38640    +18682
+ Misses      36630    17423    -19207
```
Thanks @guilhermeleobas, this LGTM modulo adding a comment for the tricky subclassing issue.
Thanks for the review, @rgommers
LGTM now, and CI failures are all unrelated (connectivity/merge issues).
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Fixes #46012