diff --git a/README.md b/README.md
index ae52c6b..d42c204 100644
--- a/README.md
+++ b/README.md
@@ -293,8 +293,8 @@ We welcome contributions from the community! FSA is an open-source project and w
 
 ### How to Contribute
 
-- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
-- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
+- **Report bugs**: Found a bug? Please [open an issue](https://github.com/flash-algo/flash-sparse-attention/issues/new?template=bug_report.yml)
+- **Request features**: Have an idea for improvement? [Let us know](https://github.com/flash-algo/flash-sparse-attention/issues/new?template=feature_request.yml)
 - **Submit code**: Ready to contribute code? Check our [Contributing Guide](CONTRIBUTING.md)
 - **Improve docs**: Help us make the documentation better
diff --git a/README_zh.md b/README_zh.md
index fdbaf73..97a3000 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -292,8 +292,8 @@ python benchmarks/grad_equivalence.py
 
 ### 如何贡献
 
-- **报告错误**: 发现了错误?请[提交 issue](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
-- **功能请求**: 有改进想法?[告诉我们](https://github.com/flash-algo/flash_sparse_attn/issues/new/choose)
+- **报告错误**: 发现了错误?请[提交 issue](https://github.com/flash-algo/flash-sparse-attention/issues/new?template=bug_report.yml)
+- **功能请求**: 有改进想法?[告诉我们](https://github.com/flash-algo/flash-sparse-attention/issues/new?template=feature_request.yml)
 - **提交代码**: 准备贡献代码?查看我们的[贡献指南](CONTRIBUTING.md)
 - **改进文档**: 帮助我们完善文档