Conversation

@LoserCheems
Collaborator

Restructures the documentation to provide complete coverage of all available functions and interfaces, including high-level auto-selection, packed variants, and variable length support.

Adds detailed usage examples for standard attention, dynamic masking, grouped-query attention, and variable length sequences, along with performance optimization tips.
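
For illustration, a usage example of this kind might look like the sketch below, which expresses dynamic masking in terms of PyTorch's built-in scaled_dot_product_attention rather than the library's own kernels. The tensor shapes, the bias tensor, and the top-k keep rule are assumptions made for this sketch, not the documented flash-dmattn API.

```python
# Reference-semantics sketch of dynamic masking (not the library's kernel API):
# keep only the top-k highest-bias keys per query and mask out the rest.
import torch
import torch.nn.functional as F

batch, heads, seq_len, head_dim, keep = 2, 8, 128, 64, 32   # assumed sizes
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
dtype = torch.bfloat16 if device.type == "cuda" else torch.float32

q = torch.randn(batch, heads, seq_len, head_dim, device=device, dtype=dtype)
k = torch.randn(batch, heads, seq_len, head_dim, device=device, dtype=dtype)
v = torch.randn(batch, heads, seq_len, head_dim, device=device, dtype=dtype)

# A per-(head, query, key) bias that drives the dynamic mask; random here.
bias = torch.randn(batch, heads, seq_len, seq_len, device=device, dtype=dtype)

# Additive mask that keeps the `keep` highest-bias keys for each query.
topk = bias.topk(keep, dim=-1).indices
mask = torch.full_like(bias, float("-inf"))
mask.scatter_(-1, topk, 0.0)

out = F.scaled_dot_product_attention(q, k, v, attn_mask=mask + bias)
print(out.shape)   # (batch, heads, seq_len, head_dim)
```

A real example in the documentation would call the library's fused kernel instead; the sketch only illustrates the masking semantics the examples cover.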

Includes a troubleshooting section covering common import errors, performance issues, and debugging techniques with memory monitoring utilities.
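
A memory-monitoring helper of the kind such a troubleshooting section might describe can be built entirely from standard torch.cuda calls; the function name and report format below are illustrative rather than taken from the documentation.

```python
# Illustrative memory-monitoring helper built on standard torch.cuda calls.
import torch

def report_cuda_memory(tag: str = "") -> None:
    """Print current, reserved, and peak CUDA memory usage in MiB."""
    if not torch.cuda.is_available():
        print(f"[{tag}] CUDA not available; nothing to report")
        return
    allocated = torch.cuda.memory_allocated() / 2**20
    reserved = torch.cuda.memory_reserved() / 2**20
    peak = torch.cuda.max_memory_allocated() / 2**20
    print(f"[{tag}] allocated={allocated:.1f} MiB "
          f"reserved={reserved:.1f} MiB peak={peak:.1f} MiB")

# Typical usage around an attention call:
# torch.cuda.reset_peak_memory_stats()
# out = attention_fn(q, k, v)          # hypothetical attention call
# report_cuda_memory("after attention")
```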

Ensures all code examples include proper device and dtype variable definitions to prevent NameError when users run the sample code.

Also adds the missing math import for completeness.
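
The preamble those fixes describe presumably resembles the following; beyond device, dtype, and the math import, the variable names and values are illustrative assumptions.

```python
# Common preamble assumed by the documentation examples: define device and
# dtype up front so later snippets do not raise NameError, and import math
# for scaling factors such as 1 / sqrt(head_dim).
import math

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
dtype = torch.bfloat16 if device.type == "cuda" else torch.float32

head_dim = 64                                # illustrative value
softmax_scale = 1.0 / math.sqrt(head_dim)
```
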
LoserCheems requested a review from Copilot on July 30, 2025 at 13:36
Contributor

Copilot AI left a comment

Pull Request Overview

This PR significantly expands and restructures the API documentation to provide comprehensive coverage of all available functions and interfaces in the Flash Dynamic Mask Attention library. The documentation is transformed from a basic reference to a complete guide with practical examples and troubleshooting information.

Key changes include:

  • Restructured API organization with high-level auto-selection functions, packed variants, and variable length support
  • Added detailed usage examples for standard attention, dynamic masking, grouped-query attention, and variable length sequences (see the variable-length sketch below)
  • Included performance optimization tips and a comprehensive troubleshooting section with memory monitoring utilities
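
To make the variable-length item concrete, the sketch below shows the cu_seqlens packing convention used by flash-attention-style varlen interfaces, which flash-dmattn is assumed to follow; the flash_dmattn_varlen_func call in the trailing comment is a hypothetical signature, not verified against the actual API.

```python
# Sketch of packing variable-length sequences into the cu_seqlens format used
# by flash-attention-style varlen interfaces (assumed, not verified, for this
# library). Sequences are concatenated along a single "total tokens" dimension
# and their boundaries are recorded as cumulative offsets.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
dtype = torch.bfloat16 if device.type == "cuda" else torch.float32

seqlens = torch.tensor([5, 9, 3], device=device)   # per-sequence lengths
heads, head_dim = 8, 64
total = int(seqlens.sum())

# (total_tokens, heads, head_dim) packed layout with no padding tokens.
q = torch.randn(total, heads, head_dim, device=device, dtype=dtype)
k = torch.randn(total, heads, head_dim, device=device, dtype=dtype)
v = torch.randn(total, heads, head_dim, device=device, dtype=dtype)

# cu_seqlens = [0, 5, 14, 17]: sequence i occupies rows cu_seqlens[i]:cu_seqlens[i+1].
cu_seqlens = torch.nn.functional.pad(seqlens.cumsum(0), (1, 0)).to(torch.int32)
max_seqlen = int(seqlens.max())

# A varlen call would then receive the packed tensors plus the offsets, e.g.
# out = flash_dmattn_varlen_func(q, k, v, cu_seqlens, cu_seqlens,
#                                max_seqlen, max_seqlen)   # hypothetical signature
print(cu_seqlens.tolist(), max_seqlen)
```

Packing this way avoids materializing padding, which is where most of the memory savings for ragged batches comes from.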

LoserCheems merged commit 97166b6 into main on Jul 30, 2025