XTuner Release V0.1.16
What's Changed
- set dev version by @LZHgrla in #487
- Fix type error when the visual encoder is not CLIP by @hhaAndroid in #496
- [Feature] Support Sequence parallel by @HIT-cwh in #456 (config sketch after this list)
- [Bug] Fix bugs in flash_attn1_pytorch by @HIT-cwh in #513
- [Fix] delete cat in varlen attn by @HIT-cwh in #508
- bump version to 0.1.16 by @HIT-cwh in #520
- [Improve] Add `generation_kwargs` for `EvaluateChatHook` by @LZHgrla in #501 (config sketch after this list)
- [Bugs] Fix bugs when training in non-distributed env by @HIT-cwh in #522
- [Fix] Support transformers>=4.38 and require transformers>=4.36.0 by @HIT-cwh in #494
- [Fix] Fix throughput hook by @HIT-cwh in #527
- Update README.md by @JianxinDong in #528
- [Fix] dispatch internlm rope by @HIT-cwh in #530
- Limit transformers != 4.38 by @HIT-cwh in #531
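
The sequence-parallel feature from #456 is driven from the training config. Below is a minimal config-style sketch; the option name `sequence_parallel_size` and the gradient-accumulation bookkeeping are assumptions based on XTuner's sequence-parallel documentation, not text from the PR.

```python
# Hypothetical XTuner config fragment (assumed option names, not verbatim from #456).

# Split every training sequence across this many GPUs. The data-parallel
# world size then becomes world_size // sequence_parallel_size.
sequence_parallel_size = 4

# Keep the global batch size unchanged by scaling gradient accumulation,
# since each data-parallel group now processes 1/sequence_parallel_size
# as many distinct sequences per step.
accumulative_counts = 4 * sequence_parallel_size
```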
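The `generation_kwargs` argument from #501 lets `EvaluateChatHook` forward sampling settings to the model's `generate` call during evaluation. A minimal config-style sketch follows; apart from the `generation_kwargs` name itself, the fields and import path are illustrative and may differ across XTuner versions.

```python
from transformers import AutoTokenizer
from xtuner.engine import EvaluateChatHook  # import path may differ across versions
from xtuner.utils import PROMPT_TEMPLATE

pretrained_model_name_or_path = 'internlm/internlm2-chat-7b'  # placeholder

tokenizer = dict(
    type=AutoTokenizer.from_pretrained,
    pretrained_model_name_or_path=pretrained_model_name_or_path,
    trust_remote_code=True,
    padding_side='right')

custom_hooks = [
    dict(
        type=EvaluateChatHook,
        tokenizer=tokenizer,
        every_n_iters=500,
        evaluation_inputs=['Please give me three tips for staying healthy.'],
        prompt_template=PROMPT_TEMPLATE.internlm2_chat,
        generation_kwargs=dict(  # new in v0.1.16 (#501): forwarded to generate()
            max_new_tokens=256,
            do_sample=True,
            temperature=0.7,
            top_p=0.9,
        ),
    )
]
```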
New Contributors
- @hhaAndroid made their first contribution in #496
- @JianxinDong made their first contribution in #528
Full Changelog: v0.1.15...v0.1.16