
Conversation

pzhan9 (Contributor) commented on Oct 27, 2025

Summary:
We found this log unhelpful when debugging S576170; worse, it causes log spew that buries other useful logs.

Downgrade it to TRACE instead of deleting it, in case we need it for future debugging (at that point we can manually bump its level back up).

Reviewed By: shayne-fletcher

Differential Revision: D85533659
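
For context, here is a minimal sketch of the kind of change described. It assumes the Rust `tracing` crate's level macros and a default `info`-level filter; the actual file, log message, and original level of the downgraded line are not shown in this thread, so everything below is illustrative.

```rust
use tracing::{info, trace};
use tracing_subscriber::{fmt, EnvFilter};

fn main() {
    // Default filter at INFO: anything logged at TRACE is suppressed
    // unless the operator opts in (e.g. RUST_LOG=trace) while debugging.
    fmt()
        .with_env_filter(
            EnvFilter::try_from_default_env().unwrap_or_else(|_| EnvFilter::new("info")),
        )
        .init();

    info!("still visible at the default level");
    // The PR's change amounts to moving a noisy line like the one below
    // from a higher level down to trace!, so it no longer spews by default.
    trace!("downgraded log line: hidden unless the level is bumped");
}
```

The effect is that the line survives in the codebase (and can be re-enabled via the log filter or by bumping its level back) without polluting default log output.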

meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) on Oct 27, 2025
meta-codesync bot commented on Oct 27, 2025

@pzhan9 has exported this pull request. If you are a Meta employee, you can view the originating Diff in D85533659.

meta-codesync bot commented on Oct 27, 2025

This pull request has been merged in 55fce73.

AlirezaShamsoshoara pushed a commit to AlirezaShamsoshoara/monarch that referenced this pull request Oct 30, 2025
Summary:
Pull Request resolved: meta-pytorch#1672

We found this log unhelpful when debugging S576170; worse, it causes log spew that buries other useful logs.

Downgrade it to TRACE instead of deleting it, in case we need it for future debugging (at that point we can manually bump its level back up).

Reviewed By: shayne-fletcher

Differential Revision: D85533659

fbshipit-source-id: 1b07acd5dbee9245fc305601b8366eed96998492

Labels

CLA Signed · fb-exported · Merged · meta-exported
