feat(cpu): add flash attention2 operator support#362

Merged
chenghuaWang merged 1 commit into UbiquitousLearning:v2 from chenghuaWang:v2 on Aug 8, 2025
Conversation

@chenghuaWang
Collaborator

  • Implement CPUFlashAttention2Op for float16 data type
  • Add support for flash attention2 in CPU backend
  • Include necessary kernel configurations and implementations
  • Update functional interface for flash attention2
  • Add unit tests for flash attention2 operator
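The PR itself doesn't reproduce the kernel, but the core idea behind a FlashAttention-2 forward pass is a single tiled sweep over the K/V blocks with an online softmax (a running row maximum and running denominator), deferring normalization until the end. A minimal NumPy sketch of that scheme is below; the function names and block size are illustrative only, and the actual CPUFlashAttention2Op operates on float16 with hand-tuned kernels rather than this float64 reference:

```python
import numpy as np

def naive_attention(q, k, v):
    # Reference: softmax(Q K^T / sqrt(d)) V, materializing the full score matrix.
    d = q.shape[-1]
    s = q @ k.T / np.sqrt(d)
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def flash_attention2(q, k, v, block=4):
    # FlashAttention-2 style forward pass: one pass over K/V tiles,
    # keeping per-query-row running max m and running denominator l,
    # and rescaling the partial output o whenever the max changes.
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    o = np.zeros((n, d))
    m = np.full((n, 1), -np.inf)  # running row maximum of the scores
    l = np.zeros((n, 1))          # running softmax denominator
    for j in range(0, k.shape[0], block):
        kj, vj = k[j:j + block], v[j:j + block]
        s = q @ kj.T * scale                            # (n, block) score tile
        m_new = np.maximum(m, s.max(axis=-1, keepdims=True))
        p = np.exp(s - m_new)                           # unnormalized probabilities
        alpha = np.exp(m - m_new)                       # rescale factor for old state
        l = alpha * l + p.sum(axis=-1, keepdims=True)
        o = alpha * o + p @ vj
        m = m_new
    return o / l  # FA2 applies the softmax normalization once, at the end
```

The single end-of-loop division is the main difference from the original FlashAttention, which renormalized the output on every tile; skipping those per-tile divisions is part of what makes the second version faster.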

@chenghuaWang merged commit 8c128f9 into UbiquitousLearning:v2 on Aug 8, 2025