
Conversation

@o-u-p (Contributor) commented Sep 5, 2025

Description of change

  • fix: use flash_attn_type instead of legacy flash_attn
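For context, here is a minimal sketch of the kind of change this adapts to, assuming llama.cpp replaced the legacy `flash_attn` boolean in `llama_context_params` with a `flash_attn_type` enum. This is illustrative only, not the actual node-llama-cpp addon code, and the enum value names are my assumption based on llama.cpp's naming conventions:

```cpp
// Sketch: adapting context setup from the removed boolean flag to the new enum
// in llama.cpp's C API (names are illustrative of the upstream breaking change).
#include "llama.h"

llama_context_params makeContextParams(bool enableFlashAttention) {
    llama_context_params params = llama_context_default_params();

    // Before the breaking change, the flag was a plain boolean:
    //     params.flash_attn = enableFlashAttention;
    // After the change, llama.cpp expects an enum value instead:
    params.flash_attn_type = enableFlashAttention
        ? LLAMA_FLASH_ATTN_TYPE_ENABLED
        : LLAMA_FLASH_ATTN_TYPE_DISABLED;

    return params;
}
```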

Pull-Request Checklist

  • Code is up-to-date with the master branch
  • npm run format to apply eslint formatting
  • npm run test passes with this change
  • This pull request links relevant issues as Fixes #0000
  • There are new or updated unit tests validating the change
  • Documentation has been updated to reflect this change
  • The new commits and pull request title follow conventions explained in pull request guidelines (PRs that do not follow this convention will not be merged)

@giladgd changed the title from "fix: use flash_attn_type instead of legacy flash_attn" to "fix: adapt to breaking llama.cpp changes" on Sep 6, 2025
@giladgd merged commit 76b505e into withcatai:master on Sep 6, 2025
24 checks passed
@giladgd (Member) commented Sep 6, 2025

@o-u-p Thanks for the PR!

github-actions bot commented Sep 9, 2025

🎉 This PR is included in version 3.13.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
