Conversation

@LoserCheems (Collaborator) commented:

Provides detailed documentation for the Flash Dynamic Mask Attention API, including installation instructions, parameter specifications, usage examples, and troubleshooting guides.

Covers all function parameters with type information, constraints, and behavioral descriptions to help developers integrate the CUDA-accelerated attention implementation effectively.

Includes practical examples for basic usage, performance optimization tips, and debug mode instructions to support both initial adoption and advanced use cases.

@LoserCheems added the docs (Improvements or additions to documentation) label on Jun 27, 2025
Copilot AI (Contributor) left a comment:

Pull Request Overview

This PR adds comprehensive API reference documentation for the Flash Dynamic Mask Attention API. The documentation covers installation instructions, detailed parameter descriptions with type information and constraints, practical usage examples, and guidelines for performance optimization and debug mode.

Comments suppressed due to low confidence (1)

docs/api_reference.md:194

  • The 'Returns' section documents the attention weights as 'p', but in the debug example they are named 'attn_weights'. Consider standardizing the naming convention for clarity.
output, softmax_lse, attn_weights, _ = flash_dma.fwd(
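
For illustration only, a minimal sketch of how that debug call could unpack the attention weights under the name p used in the 'Returns' section. The positional arguments, tensor layout, and dtypes passed to flash_dma.fwd below are assumptions made for this sketch, not the documented signature:

```python
import torch
import flash_dma  # CUDA extension covered by docs/api_reference.md

# Hypothetical (batch, seqlen, heads, head_dim) inputs; the shapes, dtype, and
# positional argument list for fwd() are assumptions for illustration only.
q = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.float16)

# Unpack the attention weights as `p` so the debug example matches the name
# used in the 'Returns' section of the documentation.
output, softmax_lse, p, _ = flash_dma.fwd(q, k, v)
print(output.shape)
```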

@LoserCheems merged commit 6e1d249 into main on Jun 27, 2025
