
[ONNX] Track and list model params for scripting #47348

Closed
wants to merge 30 commits

Conversation

neginraoof
Contributor

List model parameters as inputs after freezing the script module.
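For context, a minimal sketch (not the PR's actual test code, and assuming a standard PyTorch install): freezing a scripted module bakes parameter attributes into the graph as constants; the pass added in this PR then lifts them back out as graph inputs for ONNX export.

```python
import torch

class M(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.w = torch.nn.Parameter(torch.ones(2))

    def forward(self, x):
        return x * self.w

# Freezing inlines module attributes: after this, self.w no longer
# appears as a prim::GetAttr node in the frozen graph.
frozen = torch.jit.freeze(torch.jit.script(M()).eval())
print(frozen.graph)
```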

@facebook-github-bot added labels: cla signed, oncall: jit (Add this issue/PR to JIT oncall triage queue) — Nov 4, 2020
@dr-ci

dr-ci bot commented Nov 4, 2020

💊 CI failures summary and remediations

As of commit 90309a6 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

@ejguan added labels: module: onnx (Related to torch.onnx), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) — Nov 4, 2020
for (Block* sub_block : n->blocks()) {
  blocks.emplace_back(sub_block);
}
if (n->kind() == prim::SetAttr) {
Contributor

What is the difference between the prim::SetAttr and prim::GetAttr node kinds?

Contributor Author

SetAttr writes to an attribute, while GetAttr reads one.
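For illustration (a sketch, not part of this PR): scripting a module whose forward reads a parameter produces prim::GetAttr nodes in the graph, whereas assigning to an attribute would produce prim::SetAttr.

```python
import torch

class M(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.ones(3))

    def forward(self, x):
        # Reading self.weight becomes a prim::GetAttr node
        # in the scripted graph.
        return x + self.weight

m = torch.jit.script(M())
print(m.graph)
```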

torch/csrc/jit/passes/onnx/list_model_parameters.cpp (outdated, resolved)
@codecov

codecov bot commented Nov 16, 2020

Codecov Report

Merging #47348 (90309a6) into master (befab0d) will increase coverage by 6.45%.
The diff coverage is 94.91%.

@@            Coverage Diff             @@
##           master   #47348      +/-   ##
==========================================
+ Coverage   74.28%   80.73%   +6.45%     
==========================================
  Files        1858     1863       +5     
  Lines      200305   200529     +224     
==========================================
+ Hits       148787   161889   +13102     
+ Misses      51518    38640   -12878     

…oof/scripting_params


std::string fullName("self_");
for (auto& name : moduleNames) {
  fullName += name + '_';
Collaborator

Please ensure the parameter names are the same as those from named_parameters(). Can we add tests in test_utility?

Contributor Author

named_parameters() and named_buffers() do not include the "self." prefix as part of the name, so a param name can be %self.conv.weight for a named parameter conv.weight.
Do you think we should remove "self." from the param names?

Collaborator

Yes, as long as they match named_parameters() and named_buffers().
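To illustrate the naming convention under discussion (a sketch, assuming a standard PyTorch install): named_parameters() yields dotted names relative to the module root, with no "self." prefix.

```python
import torch

m = torch.nn.Sequential(torch.nn.Conv2d(1, 1, 3))
# Names are relative to the module root, e.g. "0.weight" for the
# first submodule's weight -- there is no "self." prefix.
print([name for name, _ in m.named_parameters()])
```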

torch/csrc/jit/passes/onnx/list_model_parameters.cpp (outdated, resolved)
…oof/scripting_params

Collaborator

@BowenBao left a comment

Looks good. Please ensure that the parameter names match named_parameters() and named_buffers(); preferably add a test to cover it.

@neginraoof force-pushed the neraoof/scripting_params branch 2 times, most recently from 8fb4ae8 to 65ecd22 — December 2, 2020 01:51
Contributor

@facebook-github-bot left a comment

@bzinodev has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@bzinodev merged this pull request in 15bc21c.

@agolynski
Contributor

agolynski commented Dec 7, 2020

@neginraoof This PR could be causing failures:
Unexpected keyword argument "preserveParameters" for "freeze_module"

Is it possible that you forgot to update torch/_C/__init__.pyi.in to include that param in the _freeze_module interface?

@agolynski
Contributor

Apologies, this is not the fault of this PR; the issue is with #48782, which added outdated mypy annotations.
