2 changes: 1 addition & 1 deletion CITATION.cff
@@ -2,7 +2,7 @@ cff-version: "1.2.0"
date-released: 2025-06
message: "If you use this software, please cite it using these metadata."
title: "Flash Sparse Attention: Trainable Dynamic Mask Sparse Attention"
url: "https://github.com/SmallDoges/flash-sparse-attention"
url: "https://github.com/flash-algo/flash-sparse-attention"
authors:
- family-names: Shi
given-names: Jingze
12 changes: 6 additions & 6 deletions CONTRIBUTING.md
@@ -4,7 +4,7 @@ Everyone is welcome to contribute, and we value everybody's contribution. Code c

It also helps us if you spread the word! Reference the library in blog posts about the awesome projects it made possible, shout out on Twitter every time it has helped you, or simply ⭐️ the repository to say thank you.

-However you choose to contribute, please be mindful of and respect our [code of conduct](https://github.com/SmallDoges/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).
+However you choose to contribute, please be mindful of and respect our [code of conduct](https://github.com/flash-algo/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).

## Ways to contribute

@@ -16,7 +16,7 @@ There are several ways you can contribute to Flash-DMA:
* Contribute to the examples, benchmarks, or documentation.
* Improve CUDA kernel performance.

-If you don't know where to start, there is a special [Good First Issue](https://github.com/SmallDoges/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open source.
+If you don't know where to start, there is a special [Good First Issue](https://github.com/flash-algo/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open source.

> All contributions are equally valuable to the community. 🥰

@@ -81,14 +81,14 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **

### Development Setup

-1. Fork the [repository](https://github.com/SmallDoges/flash-sparse-attention) by clicking on the **Fork** button.
+1. Fork the [repository](https://github.com/flash-algo/flash-sparse-attention) by clicking on the **Fork** button.

2. Clone your fork to your local disk, and add the base repository as a remote:

```bash
git clone https://github.com/<your Github handle>/flash-sparse-attention.git
cd flash-sparse-attention
-git remote add upstream https://github.com/SmallDoges/flash-sparse-attention.git
+git remote add upstream https://github.com/flash-algo/flash-sparse-attention.git
```
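
   You can verify that both remotes are set with a standard git command:

   ```bash
   # origin should point to your fork, upstream to the base repository
   git remote -v
   ```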

3. Create a new branch to hold your development changes:
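
   A minimal sketch of this step, with an illustrative branch name:

   ```bash
   # Create and switch to a new feature branch (the name is illustrative)
   git checkout -b my-feature-branch
   ```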
@@ -157,7 +157,7 @@

### Tests

-An extensive test suite is included to test the library's behavior and performance. Tests can be found in the [tests](https://github.com/SmallDoges/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/SmallDoges/flash-sparse-attention/tree/main/benchmarks) folder.
+An extensive test suite is included to test the library's behavior and performance. Tests can be found in the [tests](https://github.com/flash-algo/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/flash-algo/flash-sparse-attention/tree/main/benchmarks) folder.

We use `pytest` for testing. From the root of the repository, run:
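
A minimal invocation, assuming the suite lives in the tests folder referenced above:

```bash
# Run the full test suite from the repository root
pytest tests/
```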

@@ -200,6 +200,6 @@ If you discover a security vulnerability, please send an e-mail to the maintaine

## Questions?

-If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/SmallDoges/flash-sparse-attention/discussions) or open an issue.
+If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/flash-algo/flash-sparse-attention/discussions) or open an issue.

Thank you for contributing to Flash Sparse Attention! 🚀
8 changes: 4 additions & 4 deletions README.md
@@ -1,5 +1,5 @@
<!-- <div align="center">
-<img src="./assets/logo.png" alt="SmallDoges" width="100%">
+<img src="./assets/logo.png" alt="flash-algo" width="100%">
</div> -->

<div align="center">
@@ -67,7 +67,7 @@ pip install flash-sparse-attn --no-build-isolation
Alternatively, you can compile and install from source:

```bash
-git clone https://github.com/SmallDoges/flash-sparse-attn.git
+git clone https://github.com/flash-algo/flash-sparse-attn.git
cd flash-sparse-attn
Comment on lines +70 to 71

Copilot AI Nov 11, 2025
Repository name mismatch: the clone URL uses 'flash-sparse-attn' but should be 'flash-sparse-attention' to match the repository name used in all other URLs (setup.py, pyproject.toml, SECURITY.md, CONTRIBUTING.md, CITATION.cff).

Suggested change
-git clone https://github.com/flash-algo/flash-sparse-attn.git
-cd flash-sparse-attn
+git clone https://github.com/flash-algo/flash-sparse-attention.git
+cd flash-sparse-attention

pip install . --no-build-isolation
```
@@ -293,8 +293,8 @@ We welcome contributions from the community! FSA is an open-source project and w

### How to Contribute

-- **Report bugs**: Found a bug? Please [open an issue](https://github.com/SmallDoges/flash_sparse_attn/issues/new/choose)
-- **Request features**: Have an idea for improvement? [Let us know](https://github.com/SmallDoges/flash_sparse_attn/issues/new/choose)
+- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
+- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
Comment on lines +296 to +297

Copilot AI Nov 11, 2025
Repository name mismatch: the URLs use 'flash_sparse_attn' (with underscores) but should be 'flash-sparse-attention' (with hyphens) to match the repository name used consistently in other files.

Suggested change
-- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
-- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
+- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash-sparse-attention/issues/new/choose)
+- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash-sparse-attention/issues/new/choose)

- **Submit code**: Ready to contribute code? Check our [Contributing Guide](CONTRIBUTING.md)
- **Improve docs**: Help us make the documentation better

8 changes: 4 additions & 4 deletions README_zh.md
@@ -1,5 +1,5 @@
<!-- <div align="center">
-<img src="./assets/logo.png" alt="SmallDoges" width="100%">
+<img src="./assets/logo.png" alt="flash-algo" width="100%">
</div> -->

<div align="center">
Expand Down Expand Up @@ -67,7 +67,7 @@ pip install flash-sparse-attn --no-build-isolation
Alternatively, you can compile and install from source:

```bash
-git clone https://github.com/SmallDoges/flash-sparse-attn.git
+git clone https://github.com/flash-algo/flash-sparse-attn.git
cd flash-sparse-attn
Comment on lines +70 to 71

Copilot AI Nov 11, 2025
Repository name mismatch: the clone URL uses 'flash-sparse-attn' but should be 'flash-sparse-attention' to match the repository name used in all other URLs (setup.py, pyproject.toml, SECURITY.md, CONTRIBUTING.md, CITATION.cff).

Suggested change
-git clone https://github.com/flash-algo/flash-sparse-attn.git
-cd flash-sparse-attn
+git clone https://github.com/flash-algo/flash-sparse-attention.git
+cd flash-sparse-attention

pip install . --no-build-isolation
```
@@ -292,8 +292,8 @@ python benchmarks/grad_equivalence.py

### How to Contribute

-- **Report bugs**: Found a bug? Please [open an issue](https://github.com/SmallDoges/flash_sparse_attn/issues/new/choose)
-- **Request features**: Have an idea for improvement? [Let us know](https://github.com/SmallDoges/flash_sparse_attn/issues/new/choose)
+- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
+- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
Comment on lines +295 to +296

Copilot AI Nov 11, 2025
Repository name mismatch: the URLs use 'flash_sparse_attn' (with underscores) but should be 'flash-sparse-attention' (with hyphens) to match the repository name used consistently in other files.

Suggested change
-- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
-- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
+- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash-sparse-attention/issues/new/choose)
+- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash-sparse-attention/issues/new/choose)

- **Submit code**: Ready to contribute code? Check our [Contributing Guide](CONTRIBUTING.md)
- **Improve docs**: Help us make the documentation better

6 changes: 3 additions & 3 deletions SECURITY.md
@@ -50,7 +50,7 @@ If you discover a security vulnerability, please report it responsibly:
- Include: Detailed description, reproduction steps, and potential impact

**For general bugs:**
-- Use our [GitHub Issues](https://github.com/SmallDoges/flash-sparse-attention/issues)
+- Use our [GitHub Issues](https://github.com/flash-algo/flash-sparse-attention/issues)
- Follow our [contributing guidelines](CONTRIBUTING.md)

## Response Timeline
@@ -108,5 +108,5 @@ For security-related questions or concerns:
- Project maintainers: See [AUTHORS](AUTHORS) file

For general support:
-- GitHub Issues: https://github.com/SmallDoges/flash-sparse-attention/issues
-- Documentation: https://github.com/SmallDoges/flash-sparse-attention/tree/main/docs/
+- GitHub Issues: https://github.com/flash-algo/flash-sparse-attention/issues
+- Documentation: https://github.com/flash-algo/flash-sparse-attention/tree/main/docs/
6 changes: 3 additions & 3 deletions pyproject.toml
@@ -40,9 +40,9 @@ classifiers = [
]

[project.urls]
-Homepage = "https://github.com/SmallDoges/flash-sparse-attention"
-Source = "https://github.com/SmallDoges/flash-sparse-attention"
-Issues = "https://github.com/SmallDoges/flash-sparse-attention/issues"
+Homepage = "https://github.com/flash-algo/flash-sparse-attention"
+Source = "https://github.com/flash-algo/flash-sparse-attention"
+Issues = "https://github.com/flash-algo/flash-sparse-attention/issues"

[project.optional-dependencies]
triton = [
2 changes: 1 addition & 1 deletion setup.py
@@ -37,7 +37,7 @@
PACKAGE_NAME = "flash_sparse_attn"

BASE_WHEEL_URL = (
"https://github.com/SmallDoges/flash-sparse-attention/releases/download/{tag_name}/{wheel_name}"
"https://github.com/flash-algo/flash-sparse-attention/releases/download/{tag_name}/{wheel_name}"
)

# FORCE_BUILD: Force a fresh build locally, instead of attempting to find prebuilt wheels