1 parent fc4f65a commit ccf1900
backends/cuda/runtime/shims/sdpa.h
@@ -90,7 +90,6 @@ aoti_torch_cuda__scaled_dot_product_flash_attention(
  *
  * @return AOTITorchError error code
  *
- * Note: Currently implemented using flash attention backend as a fallback.
  */
 AOTI_SHIM_EXPORT AOTITorchError
 aoti_torch_cuda__scaled_dot_product_efficient_attention(
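The line removed above documented that the efficient-attention entry point currently dispatches to the flash-attention backend. As a rough, non-authoritative sketch of that fallback pattern (all names, types, and parameters below are hypothetical and not taken from the real sdpa.h), the delegation could look like:

// Hypothetical sketch only: it illustrates the idea in the removed note,
// an "efficient attention" shim that falls back to the flash-attention
// backend. None of these names, types, or signatures are taken from the
// actual ExecuTorch headers.
#include <cstdint>

using AOTITorchError = int32_t;      // assumed error-code type
constexpr AOTITorchError kAOTIOk = 0;

struct AOTITensorHandle;             // assumed opaque tensor handle

// Assumed flash-attention entry point used as the fallback path.
AOTITorchError flash_attention_forward(AOTITensorHandle* query,
                                       AOTITensorHandle* key,
                                       AOTITensorHandle* value,
                                       AOTITensorHandle** output) {
  // A real implementation would launch the flash-attention CUDA kernels here.
  (void)query; (void)key; (void)value; (void)output;
  return kAOTIOk;
}

// Sketch of an efficient-attention shim that simply delegates to the
// flash-attention backend and propagates its error code.
AOTITorchError efficient_attention_sketch(AOTITensorHandle* query,
                                          AOTITensorHandle* key,
                                          AOTITensorHandle* value,
                                          AOTITensorHandle** output) {
  return flash_attention_forward(query, key, value, output);
}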