
Conversation

ngimel (Collaborator) commented Dec 5, 2022

pytorch-bot bot commented Dec 5, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/90235

Note: Links to docs will display an error until the docs builds have been completed.

❌ 2 Failures, 2 Pending

As of commit 7b36aed:

The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

ngimel (Collaborator, Author) commented Dec 6, 2022

@pytorchbot merge -f "test failures unrelated"

pytorchmergebot (Collaborator) commented:
Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

kulinseth pushed a commit to kulinseth/pytorch that referenced this pull request Dec 10, 2022
Matmul padding is beneficial not only for fp32; fp16/bf16 with AMP can benefit as well.

Pull Request resolved: pytorch#90235
Approved by: https://github.com/jiawenliu64
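The commit message above refers to padding matmul operand shapes so their dimensions land on alignment-friendly multiples, which tends to improve tensor-core utilization for fp16/bf16 matmuls under AMP. The sketch below is only an illustration of that general idea, not the code from this PR; the helper name `padded_matmul`, the input shapes, and the multiple of 8 are assumptions for the example.

```python
# Illustrative sketch (not this PR's implementation): pad matmul operands so the
# inner/output dimensions are multiples of 8, then slice the result back.
import torch
import torch.nn.functional as F

def padded_matmul(a: torch.Tensor, b: torch.Tensor, multiple: int = 8) -> torch.Tensor:
    # a: (M, K), b: (K, N). Zero-pad K and N up to the next multiple; zeros in the
    # padded K region do not change the product, so slicing recovers a @ b exactly.
    m, k = a.shape
    k2, n = b.shape
    assert k == k2, "inner dimensions must match"
    pad_k = (-k) % multiple
    pad_n = (-n) % multiple
    if pad_k == 0 and pad_n == 0:
        return a @ b
    # F.pad pads the last dimension first: (left, right, top, bottom) for 2-D tensors.
    a_p = F.pad(a, (0, pad_k))              # (M, K + pad_k)
    b_p = F.pad(b, (0, pad_n, 0, pad_k))    # (K + pad_k, N + pad_n)
    return (a_p @ b_p)[:, :n]

# Usage under AMP, where matmuls run in fp16 and benefit from aligned shapes.
if torch.cuda.is_available():
    a = torch.randn(255, 1021, device="cuda")
    b = torch.randn(1021, 511, device="cuda")
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        out = padded_matmul(a, b)
    print(out.shape)  # torch.Size([255, 511])
```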
github-actions bot deleted the ngimel/fp16_matmul_padding branch June 6, 2024 01:52