Conversation

@renovate renovate bot commented Jun 20, 2025

This PR contains the following updates:

| Package | Change |
| --- | --- |
| [accelerate](https://github.com/huggingface/accelerate) | `==1.7.0` -> `==1.8.1` |

Release Notes

huggingface/accelerate (accelerate)

v1.8.1: Patchfix

Compare Source

Full Changelog: huggingface/accelerate@v1.8.0...v1.8.1

v1.8.0: FSDPv2 + FP8, Regional Compilation for DeepSpeed, Faster Distributed Training on Intel CPUs, ipex.optimize deprecation

Compare Source

FSDPv2 refactor + FP8 support

We've simplified how FSDP2 models are prepared, as there were previously too many ways to compose FSDP2 with other features (FP8, torch.compile, activation checkpointing, and so on). Although the setup is now more restrictive, it leads to fewer errors and a more performant user experience. We've also added FP8 support; you can read about the results here. Thanks to @S1ro1 for this contribution!
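As a rough illustration, the sketch below shows what preparing a model under FSDP2 with FP8 can look like. The `fsdp_version=2` plugin field and the `"fp8"` mixed-precision flag are assumptions based on these notes, not a verbatim recipe from the release, and FP8 additionally needs a backend such as TransformerEngine installed:

```python
import torch
from accelerate import Accelerator
from accelerate.utils import FullyShardedDataParallelPlugin

# Assumption: FSDP2 is selected through the plugin's fsdp_version field.
fsdp_plugin = FullyShardedDataParallelPlugin(fsdp_version=2)

# Assumption: FP8 is requested as a mixed-precision mode; this needs an
# FP8 backend (e.g., TransformerEngine) available at runtime.
accelerator = Accelerator(fsdp_plugin=fsdp_plugin, mixed_precision="fp8")

model = torch.nn.Linear(1024, 1024)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# prepare() applies the FSDP2 wrapping (and FP8 handling, if configured).
model, optimizer = accelerator.prepare(model, optimizer)
```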

Faster Distributed Training on Intel CPUs

We updated the CCL_WORKER_COUNT variable and added KMP parameters for Intel CPU users. This significantly improves distributed training performance, with up to a 40% speed-up on 4th Gen Intel Xeon when training transformer models with tensor parallelism (TP).
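For anyone setting these knobs by hand, a minimal sketch follows. The variable names are the real oneCCL / Intel OpenMP ones, but the specific values are illustrative assumptions; the release notes don't list the exact settings accelerate now applies:

```python
import os

# Illustrative values (assumptions); accelerate now tunes these for you.
os.environ.setdefault("CCL_WORKER_COUNT", "1")
os.environ.setdefault("KMP_AFFINITY", "granularity=fine,compact,1,0")
os.environ.setdefault("KMP_BLOCKTIME", "1")

# Environment variables must be set before the library spins up workers.
from accelerate import Accelerator

accelerator = Accelerator()
```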

Regional Compilation for DeepSpeed

We added support for regional compilation with the DeepSpeed engine. DeepSpeed’s .compile() modifies models in-place using torch.nn.Module.compile(...), rather than the out-of-place torch.compile(...), so we had to account for that. Thanks @IlyasMoutawwakil for this feature!
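The distinction the paragraph draws can be seen in plain PyTorch (this is just the upstream API, not accelerate's internal handling):

```python
import torch

model = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.ReLU())

# Out-of-place: torch.compile returns a new callable; `model` is untouched.
compiled_model = torch.compile(model)

# In-place: torch.nn.Module.compile() mutates the module itself, which is
# the path DeepSpeed's .compile() takes and what accelerate now handles.
model.compile()
```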

ipex.optimize deprecation

ipex.optimize is being deprecated. Most optimizations have been upstreamed to PyTorch, and future improvements will land there directly. For users without PyTorch 2.8, we’ll continue to rely on IPEX for now.
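In practice this amounts to a version gate roughly like the sketch below; the exact check accelerate performs is an assumption here:

```python
import torch
from packaging import version

model = torch.nn.Linear(16, 16)

# Assumption: pre-2.8 PyTorch still goes through the (deprecated) IPEX path.
if version.parse(torch.__version__) < version.parse("2.8"):
    try:
        import intel_extension_for_pytorch as ipex
        model = ipex.optimize(model)
    except ImportError:
        pass  # IPEX not installed; fall back to stock PyTorch
```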

Better XPU Support

We've greatly expanded and stabilized support for Intel XPUs.
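Nothing special is required on the user side; a quick way to confirm an XPU is being picked up (the torch.xpu namespace only ships with recent PyTorch builds that include Intel GPU support, hence the guard):

```python
import torch
from accelerate import Accelerator

# Accelerator selects the XPU automatically when one is visible.
accelerator = Accelerator()
print(accelerator.device)  # e.g. xpu:0 on an Intel GPU machine

# Guarded check, since torch.xpu is absent on older PyTorch builds.
print(hasattr(torch, "xpu") and torch.xpu.is_available())
```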

Trackers

We've added support for SwanLab as an experiment tracking backend. Huge thanks to @ShaohonChen for this contribution! We also deferred all tracker initializations to prevent premature setup of distributed environments.
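A minimal sketch of what using it could look like; the `"swanlab"` identifier passed to `log_with` is an assumption based on these notes (other built-in trackers are selected by name the same way), and swanlab itself must be installed:

```python
from accelerate import Accelerator

# Assumption: SwanLab is selected by name, like the other built-in trackers.
accelerator = Accelerator(log_with="swanlab")

# Tracker initialization is deferred until this call.
accelerator.init_trackers(project_name="my-project")

accelerator.log({"train_loss": 0.42}, step=1)
accelerator.end_training()
```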


Full Changelog: huggingface/accelerate@v1.7.0...v1.8.0


Configuration

📅 Schedule: Branch creation - "on friday" (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


- [ ] If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot added the dependencies Pull requests that update a dependency file label Jun 20, 2025
@renovate renovate bot force-pushed the renovate/transformers-dep branch from 38b6bc9 to e93ea1a on June 20, 2025 at 19:32
@renovate renovate bot changed the title from `chore(deps): update dependency accelerate to v1.8.0` to `chore(deps): update dependency accelerate to v1.8.1` on Jun 20, 2025
@renovate renovate bot force-pushed the renovate/transformers-dep branch from e93ea1a to 081767b on June 21, 2025 at 09:33
@mirpo mirpo merged commit ca4f570 into main Jun 21, 2025
4 checks passed
@mirpo mirpo deleted the renovate/transformers-dep branch June 21, 2025 11:03
