
Conversation

@vkuzo (Contributor) commented Oct 13, 2025

Minor simplifications of main README.md:

  1. move the 2025H1 news to the section the user has to expand to see
  2. simplify the key value prop sentence to "TorchAO is an easy to use quantization library for native PyTorch"
  3. add some missing bash and python modifiers to code blocks
  4. rename "float8" to "quantized training" to match the rest of the document
  5. move "sparse training" and "memory efficient optimizers" to the section the user has to expand to see

pytorch-bot bot commented Oct 13, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3160

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ No Failures

As of commit e400507 with merge base 6c24a7a:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed label Oct 13, 2025
@vkuzo vkuzo added the topic: documentation label Oct 13, 2025

README.md change under review:

Before: TorchAO is a PyTorch-native model optimization framework leveraging quantization and sparsity to provide an end-to-end, training-to-serving workflow for AI models. TorchAO works out-of-the-box with `torch.compile()` and `FSDP2` across most HuggingFace PyTorch models. Key features include:

After: TorchAO is an easy to use quantization library for native PyTorch. TorchAO works out-of-the-box with `torch.compile()` and `FSDP2` across most HuggingFace PyTorch models. Key features include:
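For readers skimming the thread, the "easy to use quantization" framing in the new sentence maps to torchao's one-call quantization API. Below is a minimal sketch, assuming the `quantize_` / `int8_weight_only` entry points exported from `torchao.quantization` (exact config names can differ between torchao releases):

```python
# Minimal sketch of the workflow the new README sentence describes.
# Assumes the quantize_ / int8_weight_only API from torchao.quantization;
# config names may vary across torchao versions.
import torch
import torch.nn as nn
from torchao.quantization import quantize_, int8_weight_only

model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10)).cuda()

# One call, in-place weight-only int8 quantization of the model's linear layers
quantize_(model, int8_weight_only())

# Composes with torch.compile(), as the README sentence claims
model = torch.compile(model)
out = model(torch.randn(8, 1024, device="cuda"))
```

Note that this PR only rewrites the framing sentence and surrounding README structure; no APIs change.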
Contributor commented:

This looks fine to me, wondering if @supriyar has any thoughts on the wording since it's different from the paper now.

Contributor replied:

It's fine, since the new statement doesn't mean the other one isn't true anymore. I think @vkuzo is trying to simplify the message users see at first glance of the repo.

@vkuzo vkuzo merged commit 8482770 into main Oct 13, 2025
18 of 20 checks passed
