diff --git a/README.md b/README.md
index 64ae1a6..a182bbf 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Flash Dynamic Mask Attention
 
-![Flash-DMA Banner](assets/flash_dmattn_banner.jpg)
+![Flash-DMA Banner](assets/flash_dmattn_banner.png)
 
 Flash-DMA is a high-performance attention implementation that integrates Flash Attention's memory efficiency with Dynamic Mask Attention's sparse computation capabilities for processing extremely long sequences in transformer models.
 
diff --git a/assets/flash_dmattn_banner.jpg b/assets/flash_dmattn_banner.jpg
deleted file mode 100644
index bed8743..0000000
Binary files a/assets/flash_dmattn_banner.jpg and /dev/null differ
diff --git a/assets/flash_dmattn_banner.png b/assets/flash_dmattn_banner.png
new file mode 100644
index 0000000..d1820ee
Binary files /dev/null and b/assets/flash_dmattn_banner.png differ