v1.16.1: Patch release
Breaking change: BetterTransformer support for Llama, Falcon, Whisper and BART is deprecated
The BetterTransformer features for Llama, Falcon, Whisper and BART have been upstreamed into Transformers. Please use transformers>=4.36 and torch>=2.1.1 to get PyTorch's scaled_dot_product_attention by default.
More details: https://github.com/huggingface/transformers/releases/tag/v4.36.0
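As a minimal sketch of the version gate described above (the helper name and the simple dotted-version parsing are illustrative, not part of Optimum's API; pre-release suffixes such as `.dev0` are not handled):

```python
# Hedged sketch: check installed versions before relying on Transformers'
# native SDPA path instead of BetterTransformer.
# With transformers>=4.36 and torch>=2.1.1, supported models pick up
# torch.nn.functional.scaled_dot_product_attention by default, e.g.:
#   AutoModelForCausalLM.from_pretrained(model_id, attn_implementation="sdpa")

def supports_default_sdpa(transformers_version: str, torch_version: str) -> bool:
    """Return True when versions meet the thresholds from these release notes
    (transformers>=4.36, torch>=2.1.1). Expects plain X.Y.Z version strings."""
    def parse(v: str) -> tuple:
        # Drop local-version suffixes like "+cu121", keep the first three parts.
        return tuple(int(p) for p in v.split("+")[0].split(".")[:3])
    return parse(transformers_version) >= (4, 36, 0) and parse(torch_version) >= (2, 1, 1)
```

On older versions, BetterTransformer still applies; on the versions above, calling it for these models is redundant, which is why the conversion is deprecated.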
What's Changed
- Update dev version by @fxmarty in #1596
- Typo: tansformers -> transformers by @tomaarsen in #1597
- [GPTQ] fix tests by @SunMarc in #1598
- Show correct error message on using BT for SDPA models by @fxmarty in #1599
New Contributors
- @tomaarsen made their first contribution in #1597
Full Changelog: v1.16.0...v1.16.1