Conversation

pianpwk
Contributor

@pianpwk pianpwk commented Jun 3, 2025

Falls back to the non-fused (linear -> add bias) path for tensors with unbacked sizes that cannot be proven contiguous.


pytorch-bot bot commented Jun 3, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/155051

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 6eba051 with merge base ce9ba07:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pianpwk pianpwk changed the title [WIP][dynamic shapes] skip fused linear path if not definitely contiguous [dynamic shapes] skip fused linear path if not definitely contiguous Jun 10, 2025
@pianpwk pianpwk marked this pull request as ready for review June 10, 2025 20:55
@pianpwk pianpwk requested review from bobrenjc93 and laithsakka June 10, 2025 20:55
   // Also hit the fused path for contiguous 3D input, if not using xla
   // backend. Reshaping/flattening has some performance implications on xla.
-  if (input.is_contiguous() && input_dim == 3) {
+  if (definitely_contiguous(input.sym_sizes(), input.sym_strides(), input.sym_numel()) && input_dim == 3) {
Contributor

I see I'll have to change this again with my change, but it's fine for now. I'll handle it once I rebase.

Contributor


There is no caching in the current form. Can you call
definitely_contiguous(input.sym_sizes(), input.sym_strides(), input.sym_numel()) only one time?

Contributor

@laithsakka laithsakka left a comment

Seems good, as long as you are sure these are not material checks (just short circuits).
Just make sure you call definitely_contiguous only once before you land.

@pianpwk
Contributor Author

pianpwk commented Jun 12, 2025

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jun 12, 2025
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

@pianpwk
Contributor Author

pianpwk commented Jun 12, 2025

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

@github-actions github-actions bot deleted the pianpwk/unbacked_contig_lin branch July 14, 2025 02:21
Labels

ciflow/trunk (Trigger trunk jobs on your pull request), Merged, release notes: export
