Nvfuser opt in for decomposition #81134
Conversation
✅ No failures (0 pending) as of commit 1338e4f (more details on the Dr. CI page). 💚 Looks good so far! There are no failures yet. 💚 This comment was automatically generated by Dr. CI. Please report bugs/suggestions to the (internal) Dr. CI Users group.
@pytorchbot rebase
@pytorchbot successfully started a rebase job. Check the current status here
Successfully rebased from 0481e83 to 022bfb4.
@jjsjann123 do you know if you have permission to push branches to the pytorch/pytorch repo? I believe that's required to be able to run torchbench CI.
looks good to me!
Opened a PR to run torchbench (#81292) just to see if it affects any other models, but I'm guessing that the microbenchmarks won't be affected because they are collected after the decomposition...
Also, it might be worth documenting these in the readme?
I think I have access to push to the pytorch repo directly. I'll try that next time~~
XLA failure seems unrelated. I'm merging this one.
@pytorchbot merge
@pytorchbot successfully started a merge job. Check the current status here
Merge failed due to: Refusing to merge as mandatory check(s) pull failed for rule superuser
@jjsjann123 FYI, the failing XLA test is flaky, so if it fails again we can just re-run the test and hope it passes.
@pytorchbot merge
@pytorchbot successfully started a merge job. Check the current status here
Hey @jjsjann123.
Summary: Regarding the issues reported in #79246, we noticed that decomposing the bias out of conv/linear can actually hurt performance due to compilation overhead. This PR makes the decomposition an explicit user opt-in to avoid these regressions.
Pull Request resolved: #81134
Approved by: https://github.com/davidberard98
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/d3acbc821e4a3a29bf252f990a817b2103658a4c
Reviewed By: DanilBaibak
Differential Revision: D37813470
Pulled By: DanilBaibak
fbshipit-source-id: c6f27699d868e92e1e31232b5d7c94d4762530e6
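For context on what "opt-in" means in practice, here is a minimal sketch of how a user might enable the decomposition from Python. The PYTORCH_NVFUSER_ENABLE environment variable and the linear_decomposition / conv_decomposition option names are assumptions about how this PR exposes the switch (they are not confirmed anywhere in this thread); the nvfuser readme mentioned above is the authoritative reference for the actual flag names.

```python
# Sketch only: opting in to nvfuser's conv/linear bias decomposition.
# ASSUMPTION: the opt-in is exposed via the PYTORCH_NVFUSER_ENABLE env var
# with "linear_decomposition" / "conv_decomposition" options; check the
# nvfuser readme for the real spelling. Requires a CUDA-capable GPU.
import os

# Set the flag before importing torch so it is picked up at startup.
os.environ["PYTORCH_NVFUSER_ENABLE"] = "linear_decomposition,conv_decomposition"

import torch

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, bias=True),
    torch.nn.ReLU(),
).cuda().eval()

scripted = torch.jit.script(model)
x = torch.randn(1, 3, 32, 32, device="cuda")

# A few warm-up iterations let the TorchScript profiling executor hand the
# graph to nvfuser; with the opt-in set, the bias add can be decomposed and
# fused. Without it (the new default), the conv/linear op is left intact.
with torch.no_grad():
    for _ in range(3):
        scripted(x)
```

The design choice the summary describes is simply that the default flips: users who measured a win from the decomposition re-enable it explicitly, while everyone else avoids the extra compilation cost.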