[dynamo] Clean up assert in dynamo [3/N] #165903
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/165903
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (1 Unrelated Failure)
As of commit 6c31dc3 with merge base 32fe4f6.
FLAKY - The following job failed but was likely due to flakiness present on trunk.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
Extends from #165430.

Stack:
* #165903 (clean up for graph break)
* -> #165745
* #165430

One main refactor from the previous PR:
* For assertions that check `len(args)` or `len(kwargs)`, use `raise_args_mismatch` instead of `raise_type_error_exc`.

I am also considering moving `raise_type_error_exc` into `utils.py` for consistency.

Pull Request resolved: #165745
Approved by: https://github.com/Lucaskabela
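A minimal sketch of that refactor, assuming a `raise_args_mismatch` helper that surfaces a user-facing arity error; the signature and message format used here are hypothetical, not the actual ones in `torch/_dynamo`:

```python
# Hypothetical sketch of the refactor: replace internal asserts on
# len(args)/len(kwargs) with a user-facing args-mismatch error. The
# helper's real signature in torch/_dynamo may differ.

def raise_args_mismatch(fn_name: str, expected_args: int, args: tuple, kwargs: dict) -> None:
    # Stand-in: raise a TypeError the user can act on, instead of an
    # AssertionError from an internal invariant check.
    raise TypeError(
        f"{fn_name}() expected {expected_args} positional argument(s) and "
        f"no keyword arguments, got {len(args)} and {len(kwargs)}"
    )

def call_method(fn_name: str, args: tuple, kwargs: dict):
    # Before: assert len(args) == 1 and not kwargs
    # After: a user-triggerable condition becomes a proper error.
    if len(args) != 1 or kwargs:
        raise_args_mismatch(fn_name, 1, args, kwargs)
    return args[0]
```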
@pytorchbot label "topic: not user facing"
  @staticmethod
  def create(func, args, kwargs):
-     assert func in [
+     if func not in [
This assert is fine to leave as-is, since we should only be creating this variable if the condition is true from the caller. Code pointer: pytorch/torch/_dynamo/variables/torch.py, line 398 in 9901d44:

    return AutocastModeVariable.create(self.value, args, kwargs)
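To make the reviewer's reasoning concrete, here is a hypothetical sketch (the name `SUPPORTED_AUTOCAST_FUNCS`, the simplified bodies, and the caller shape are assumptions, not the actual Dynamo code): the caller performs the same membership check before dispatching, so the assert inside `create` guards an internal invariant rather than user input.

```python
# Hypothetical sketch, not the actual torch/_dynamo code: create()'s
# assert can only fire on an internal bug, because the sole call site
# checks the same condition first.

SUPPORTED_AUTOCAST_FUNCS = ["autocast"]  # stand-in for the real list

class AutocastModeVariable:
    @staticmethod
    def create(func, args, kwargs):
        # Internal invariant: guaranteed by the dispatch below.
        assert func in SUPPORTED_AUTOCAST_FUNCS
        return ("AutocastModeVariable", func, args, kwargs)

def dispatch_call(func, args, kwargs):
    # Mirrors the caller-side check at torch.py line 398: create() is
    # only reached when the condition already holds.
    if func in SUPPORTED_AUTOCAST_FUNCS:
        return AutocastModeVariable.create(func, args, kwargs)
    return None
```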
Thanks for the suggestion :) I have updated it.

torch/_dynamo/variables/misc.py (Outdated)
      | "queue_callback() is only supported when Compiled Autograd is enabled with fullgraph=True" | ||
| ) | ||
| if not (tx.one_graph or tx.error_on_graph_break): | ||
| unimplemented_v2( | 
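As a rough illustration of the guard above, here is a self-contained sketch; `GraphBreak`, the `unimplemented_v2` stand-in and its keyword arguments, and the `tx` fields are assumptions modeled on Dynamo's helpers, not the actual implementation.

```python
# Self-contained sketch of the guarded graph-break pattern above. The
# unimplemented_v2 stand-in is an approximation of Dynamo's helper.
from dataclasses import dataclass

class GraphBreak(Exception):
    """Stand-in for the exception Dynamo uses to signal a graph break."""

def unimplemented_v2(gb_type, context, explanation, hints):
    # Approximation: raise so the tracer falls back to eager execution.
    raise GraphBreak(f"{gb_type}: {explanation} ({context})")

@dataclass
class Tx:  # stand-in for the instruction translator's relevant flags
    one_graph: bool = False
    error_on_graph_break: bool = False

def handle_queue_callback(tx: Tx) -> None:
    # Outside fullgraph mode, a graph break is allowed, so prefer it over
    # a hard error; in fullgraph mode, execution proceeds to the error
    # path quoted in the diff above.
    if not (tx.one_graph or tx.error_on_graph_break):
        unimplemented_v2(
            gb_type="unsupported queue_callback",
            context="torch.autograd queue_callback()",
            explanation=(
                "queue_callback() is only supported when Compiled Autograd "
                "is enabled with fullgraph=True"
            ),
            hints=[],
        )
```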
@xmfan is it ok to graph break here? or should we error out?
It seems there has been a change here recently; let's keep that for now.
Co-authored-by: William Wen <william.wen42@gmail.com>
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Some previous PRs in this stack have already been merged. This PR targets asserts that users can trigger, which may be better turned into graph breaks. Correct me if there are any problems.
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @chauhang @amjames @Lucaskabela @williamwen42