
ENH Adds test and docs for modules that already support no batch dims #62729

Closed
thomasjpfan wants to merge 12 commits.

Conversation

@thomasjpfan commented Aug 4, 2021

@thomasjpfan added the module: nn (Related to torch.nn) label Aug 4, 2021
@thomasjpfan added this to In Progress in torch.nn via automation Aug 4, 2021
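
For context, a minimal sketch (not part of this PR) of what "no batch dim" support means for a module: the same module should accept both a batched input and a single unbatched sample. The module and shapes below are illustrative assumptions, shown with the C++ frontend.

#include <torch/torch.h>
#include <iostream>

int main() {
  // A module that supports unbatched inputs accepts a single sample as
  // well as a batch; the unbatched call behaves as if a batch dimension
  // of size 1 were added and then squeezed back out.
  torch::nn::Linear linear(4, 2);

  auto batched   = linear(torch::randn({8, 4}));  // output shape: [8, 2]
  auto unbatched = linear(torch::randn({4}));     // output shape: [2]

  std::cout << batched.sizes() << " " << unbatched.sizes() << std::endl;
}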

facebook-github-bot commented Aug 4, 2021

💊 CI failures summary and remediations

As of commit df99ab6 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



@H-Huang added the triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) label Aug 5, 2021
@albanD removed their request for review August 5, 2021 14:11
@jbschlosser left a comment

Looking great! A couple of missing things, but solid otherwise.

Review thread on torch/testing/_internal/common_nn.py (resolved)
@jbschlosser

@thomasjpfan FYI checks are still failing, and the last commit looks empty (something might have gone wrong with git).


jbschlosser commented Aug 24, 2021

@thomasjpfan Based on the failing check below, it looks like the C++ API may need to be updated:

======================================================================
ERROR [0.544s]: test_torch_nn_Softmax2d_no_batch_dim (__main__.TestCppApiParity)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\actions-runner\_work\pytorch\pytorch\pytorch-1128036586\test\cpp_api_parity\module_impl_check.py", line 251, in test_fn
    test_forward_backward(
  File "C:\actions-runner\_work\pytorch\pytorch\pytorch-1128036586\test\cpp_api_parity\module_impl_check.py", line 181, in test_forward_backward
    run_cpp_test_fn_and_check_output()
  File "C:\actions-runner\_work\pytorch\pytorch\pytorch-1128036586\test\cpp_api_parity\module_impl_check.py", line 149, in run_cpp_test_fn_and_check_output
    cpp_test_fn(arg_dict_file_path, module_file_path, forward_output_file_path, backward_grad_dict_file_path)
RuntimeError: Softmax2d requires a 4D tensor as input
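
For reference, a minimal sketch of the kind of call this parity test exercises (the shapes are illustrative assumptions, not taken from the test suite), using the C++ frontend:

#include <torch/torch.h>

int main() {
  torch::nn::Softmax2d softmax;

  // Batched input (N, C, H, W): accepted by the current C++ check.
  auto batched = softmax(torch::randn({2, 3, 4, 4}));

  // Unbatched input (C, H, W): fails with "Softmax2d requires a 4D tensor
  // as input" until the TORCH_CHECK in Softmax2dImpl::forward is relaxed.
  auto unbatched = softmax(torch::randn({3, 4, 4}));
}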

@jbschlosser left a comment

LGTM!

@facebook-github-bot

@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@@ -170,8 +170,8 @@ void Softmax2dImpl::pretty_print(std::ostream& stream) const {
 }
 
 Tensor Softmax2dImpl::forward(const Tensor& input) {
-  TORCH_CHECK(input.dim() == 4, "Softmax2d requires a 4D tensor as input");
+  TORCH_CHECK(input.dim() == 4 or input.dim() == 3, "Softmax2d requires a 3D or 4D tensor as input");
   return F::detail::softmax(input, /*dim=*/1, c10::nullopt);

@thomasjpfan FYI we're getting a bunch of internal failures on Windows due to this usage of "or" instead of "||" that slipped in here :p
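
A hedged sketch of the portable spelling (the actual follow-up fix may differ): MSVC does not accept the alternative operator token "or" unless conformance mode is enabled, so "||" is the safe choice here.

// Same check as in the diff above, written with "||" so that it also
// compiles with MSVC in its default (permissive) mode.
TORCH_CHECK(
    input.dim() == 4 || input.dim() == 3,
    "Softmax2d requires a 3D or 4D tensor as input");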


codecov bot commented Sep 1, 2021

Codecov Report

Merging #62729 (df99ab6) into master (92b31b5) will decrease coverage by 0.00%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master   #62729      +/-   ##
==========================================
- Coverage   66.63%   66.63%   -0.01%     
==========================================
  Files         705      705              
  Lines       92192    92192              
==========================================
- Hits        61434    61433       -1     
- Misses      30758    30759       +1     

@facebook-github-bot

@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

torch.nn automation moved this from In Progress to Done Sep 2, 2021
@facebook-github-bot

@jbschlosser merged this pull request in 7d01053.

Labels: cla signed · Merged · module: nn (Related to torch.nn) · open source · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Projects: torch.nn (Done)
Linked issues: none yet
5 participants