
Nicer logs for dynamic shapes #99277

Closed

Conversation

voznesenskym (Collaborator) commented Apr 16, 2023

[ghstack-poisoned]
pytorch-bot bot commented Apr 16, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/99277

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ No Failures

As of commit 1ecf54e:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: fx release notes category label Apr 16, 2023
voznesenskym added a commit that referenced this pull request Apr 16, 2023
ghstack-source-id: 1f54111b681f46459d96358c50aa7460b1ff8d45
Pull Request resolved: #99277
@@ -1176,18 +1180,34 @@ def wrap_to_fake_tensor_and_record(
# If there is no entry for this source, add the tensor to frame state with its current static size.
# E.g., {} -> {"x": [2, 4]}
curr_sizes = list(e.size())
log.debug("Registered static shapes %s for %s", curr_sizes, name)
voznesenskym (Collaborator, Author) commented:

This one gets very noisy, so maybe we keep it at debug and make the others info? Open to bikeshedding. @ezyang, opine please.

Contributor commented:

It is hard to say without actually using the logs. Keeping everything at debug to start is safe, assuming this doesn't trigger too much (but I don't think it does, since we debug-trace every opcode in dynamo). Note that you can selectively raise logging for ONLY this module with TORCH_LOGS=+torch._dynamo.variables.builder
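The per-module override mentioned above can be sketched with the stdlib logging module's hierarchical loggers; the module names below are placeholders, not the real torch modules:

```python
import logging

# "mylib" stands in for a library root, "mylib.variables.builder" for the
# one submodule we want verbose. Everything else inherits the quiet default.
root = logging.getLogger("mylib")
builder = logging.getLogger("mylib.variables.builder")

root.setLevel(logging.WARNING)    # default: warnings only, library-wide
builder.setLevel(logging.DEBUG)   # "+mylib.variables.builder": debug here only

print(builder.isEnabledFor(logging.DEBUG))                           # True
print(logging.getLogger("mylib.other").isEnabledFor(logging.DEBUG))  # False
```

Child loggers without an explicit level walk up the dotted name to find their effective level, which is why the override affects only the one module.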

Contributor commented:

I would bikeshed the message here, though; "registered" isn't very descriptive. In the log message, I think I want the words frame_state, maybe frame_state[name] = curr_sizes. But in that case, why don't we move the log message down to where we actually do the setter, and then delete all the other branches?
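The suggestion above, logging at the single mutation point rather than in every branch, can be sketched as follows; record_frame_state is a hypothetical helper for illustration, not the actual builder code:

```python
import logging

log = logging.getLogger(__name__)

def record_frame_state(frame_state, name, curr_sizes):
    # The debug line lives next to the one real setter, so every branch
    # that computes curr_sizes gets logged without its own log call.
    log.debug("frame_state[%s] = %s", name, curr_sizes)
    frame_state[name] = curr_sizes
    return frame_state

print(record_frame_state({}, "x", [2, 4]))  # {'x': [2, 4]}
```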

cc soumith penguinwu anijain2305 EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng Xia-Weiwen wenzhe-nrv jiayisunx desertfire

[ghstack-poisoned]
voznesenskym added a commit that referenced this pull request Apr 16, 2023
ghstack-source-id: 48665ec78aa53fcb7434b2f92b0625baf7fb5ab3
Pull Request resolved: #99277

Better little none check
@@ -1945,10 +1945,10 @@ def hint():
msg = f" {len(error_msgs) + 1}. {msg()}"
error_msgs.append(msg)
if len(error_msgs) > 0:
-log.warning("Warning only constraints violated %s", warn_msgs)
+log.info("Warning only constraints violated %s", warn_msgs)
Contributor commented:

Wait, this looks like a botched merge; we shouldn't warn here at all, we're going to raise an error!

raise ConstraintViolationError(f"Constraints violated!\n{error_msgs}")
elif len(warn_msgs) > 0:
-log.warning("%s Warning only constraints violated", len(warn_msgs))
+log.info("%s Warning only constraints violated", len(warn_msgs))
Contributor commented:

This probably suppresses too much, but we can figure out a better strategy later (maybe only warn once per frame).

@albanD albanD removed their request for review April 17, 2023 17:42
voznesenskym (Collaborator, Author) commented:

@pytorchbot rebase

pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a rebase job. Check the current status here

pytorchmergebot (Collaborator) commented:

Rebase failed due to: Command git -C /home/runner/work/pytorch/pytorch rebase refs/remotes/origin/viable/strict gh/voznesenskym/102/orig returned non-zero exit code 1

warning: skipped previously applied commit cf387a5b65c
warning: skipped previously applied commit fb358849425
hint: use --reapply-cherry-picks to include skipped commits
hint: Disable this message with "git config advice.skippedCherryPicks false"
Rebasing (1/1)
Auto-merging torch/_dynamo/variables/builder.py
CONFLICT (content): Merge conflict in torch/_dynamo/variables/builder.py
Auto-merging torch/fx/experimental/symbolic_shapes.py
CONFLICT (content): Merge conflict in torch/fx/experimental/symbolic_shapes.py
error: could not apply 62d51eea02a... Nicer logs for dynamic shapes
hint: Resolve all conflicts manually, mark them as resolved with
hint: "git add/rm <conflicted_files>", then run "git rebase --continue".
hint: You can instead skip this commit: run "git rebase --skip".
hint: To abort and get back to the state before "git rebase", run "git rebase --abort".
Could not apply 62d51eea02a... Nicer logs for dynamic shapes

Raised by https://github.com/pytorch/pytorch/actions/runs/4783046149


[ghstack-poisoned]
voznesenskym added a commit that referenced this pull request Apr 24, 2023
ghstack-source-id: e62c51a2ad28a568699180f57613c165bdea7820
Pull Request resolved: #99277

Better little none check
voznesenskym (Collaborator, Author) commented:

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Apr 24, 2023
pytorchmergebot (Collaborator) commented:

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

@facebook-github-bot facebook-github-bot deleted the gh/voznesenskym/102/head branch June 8, 2023 18:59
Labels: ciflow/inductor, ciflow/trunk, Merged, merging, module: dynamo, release notes: fx

3 participants