
add dtype information for input #55358

Closed · wants to merge 13 commits

Conversation

guotuofeng (Contributor):

Add dtype for all inputs besides the input dimensions.

facebook-github-bot (Contributor) commented Apr 6, 2021

💊 CI failures summary and remediations

As of commit 83af5a5 (more details on the Dr. CI page):


  • 2/2 failures introduced in this PR

2 failures not recognized by patterns:

Job                                                 | Step | Action
CircleCI pytorch_windows_vs2019_py36_cuda10.1_test2 | Test | 🔁 rerun
CircleCI pytorch_windows_vs2019_py36_cuda10.1_test1 | Test | 🔁 rerun

This comment was automatically generated by Dr. CI. Follow this link to opt-out of these comments for your Pull Requests.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

codecov bot commented Apr 6, 2021

Codecov Report

Merging #55358 (9d087ac) into master (4575028) will increase coverage by 35.06%.
The diff coverage is 92.59%.

❗ Current head 9d087ac differs from the pull request's most recent head 83af5a5. Consider uploading reports for commit 83af5a5 to get more accurate results.

@@             Coverage Diff             @@
##           master   #55358       +/-   ##
===========================================
+ Coverage   41.96%   77.03%   +35.06%     
===========================================
  Files         581     1924     +1343     
  Lines       73430   190616   +117186     
===========================================
+ Hits        30813   146834   +116021     
- Misses      42617    43782     +1165     

@ngimel ngimel removed the request for review from BowenBao April 12, 2021 23:52
@ngimel ngimel added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Apr 12, 2021
torch/csrc/autograd/profiler_legacy.cpp (inline review comment, outdated, resolved)
ilia-cher (Contributor):

Thank you, looks good, modulo one minor comment. Also there might be some (trivial) merge conflict, so please rebase and we'll continue with landing.

facebook-github-bot (Contributor):

@ilia-cher has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

types.reserve(fn.inputs().size());
for (const c10::IValue& input : fn.inputs()) {
  if (!input.isTensor()) {
    types.emplace_back();
Contributor:

input.tagKind() ?

Contributor:

Also, do we care about list types?

Contributor Author:

For the time being, we only need to care about scalars and tensors. In what scenarios would we need to care about lists?
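
For context, here is a minimal sketch of the type-recording idea discussed above, assuming the profiler hands over the op's inputs as a list of c10::IValue. The helper name inputDtypes is hypothetical; the tagKind() fallback for non-tensor inputs follows the reviewer's suggestion, while the merged change itself only needs to distinguish scalars and tensors.

#include <ATen/core/ivalue.h>
#include <c10/core/ScalarType.h>
#include <string>
#include <vector>

// Hypothetical helper: collect one type string per input IValue.
std::vector<std::string> inputDtypes(const std::vector<c10::IValue>& inputs) {
  std::vector<std::string> types;
  types.reserve(inputs.size());
  for (const c10::IValue& input : inputs) {
    if (input.isTensor()) {
      at::Tensor t = input.toTensor();
      // Tensor inputs: record the scalar type, e.g. "Float" or "Long".
      types.emplace_back(t.defined() ? c10::toString(t.scalar_type()) : "");
    } else if (input.isScalar()) {
      // Scalar inputs: a c10::Scalar also carries a ScalarType.
      types.emplace_back(c10::toString(input.toScalar().type()));
    } else {
      // Anything else (lists, strings, ...): fall back to the IValue tag,
      // as suggested by the reviewer's input.tagKind() comment.
      types.emplace_back(input.tagKind());
    }
  }
  return types;
}

Pushing an empty entry (as the excerpt above does for non-tensor inputs) keeps the type list aligned index-for-index with the recorded input shapes.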

facebook-github-bot (Contributor):

@ilia-cher has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor):

@ilia-cher merged this pull request in 28f5264.

krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request May 19, 2021
Summary:
Add dtype for all inputs besides the input dimensions.

Pull Request resolved: pytorch#55358

Reviewed By: heitorschueroff

Differential Revision: D27862346

Pulled By: ilia-cher

fbshipit-source-id: 656c5d6c9f23d723b27b44f0afc1a249ce1f3e44
Labels: cla signed · Merged · open source · triaged

6 participants