Whisper-v3 ValueError: Transformers now supports natively BetterTransformer optimizations #1659
Status: Closed (2 of 4 tasks)
Labels: bug (Something isn't working)
System Info
Nvidia Docker Container 23.12
xFormers 0.0.24+6600003.d20240116
memory_efficient_attention.cutlassF: available
memory_efficient_attention.cutlassB: available
memory_efficient_attention.decoderF: available
memory_efficient_attention.flshattF@v2.3.6: available
memory_efficient_attention.flshattB@v2.3.6: available
memory_efficient_attention.smallkF: available
memory_efficient_attention.smallkB: available
memory_efficient_attention.tritonflashattF: unavailable
memory_efficient_attention.tritonflashattB: unavailable
memory_efficient_attention.triton_splitKF: available
indexing.scaled_index_addF: available
indexing.scaled_index_addB: available
indexing.index_select: available
swiglu.dual_gemm_silu: available
swiglu.gemm_fused_operand_sum: available
swiglu.fused.p.cpp: available
is_triton_available: True
pytorch.version: 2.2.0a0+81ea7a4
pytorch.cuda: available
gpu.compute_capability: 8.9
gpu.name: NVIDIA GeForce RTX 4090
dcgm_profiler: unavailable
build.info: available
build.cuda_version: 1203
build.python_version: 3.10.12
build.torch_version: 2.2.0a0+81ea7a4
build.env.TORCH_CUDA_ARCH_LIST: 5.2 6.0 6.1 7.0 7.2 7.5 8.0 8.6 8.7 9.0+PTX
build.env.XFORMERS_BUILD_TYPE: None
build.env.XFORMERS_ENABLE_DEBUG_ASSERTIONS: None
build.env.NVCC_FLAGS: None
build.env.XFORMERS_PACKAGE_FROM: None
build.nvcc_version: 12.3.107
source.privacy: open source
Who can help?
No response
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction (minimal, reproducible, runnable)
Use the code from:
https://huggingface.co/spaces/primeline/whisper-german/blob/main/app.py
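For context, this ValueError is raised by Optimum's BetterTransformer path when the model type already has native scaled-dot-product-attention (SDPA) support in Transformers, so `model.to_bettertransformer()` is rejected for Whisper on recent releases. A minimal sketch of a version guard; the 4.36 cutoff for Whisper's native SDPA is an assumption, as is the `attn_implementation="sdpa"` workaround shown in the comment:

```python
# Hedged sketch: recent Transformers releases implement
# torch.nn.functional.scaled_dot_product_attention natively for Whisper,
# so Optimum's model.to_bettertransformer() raises this ValueError.
# Assumption: the cutover happened around transformers 4.36.

def needs_bettertransformer(transformers_version: str) -> bool:
    """Return True only for old releases where to_bettertransformer()
    was still required to get fused attention for Whisper."""
    major_minor = tuple(int(p) for p in transformers_version.split(".")[:2])
    return major_minor < (4, 36)

# On newer releases, skip to_bettertransformer() and request SDPA at load
# time instead (hypothetical usage; model name taken from Whisper-v3):
#   model = AutoModelForSpeechSeq2Seq.from_pretrained(
#       "openai/whisper-large-v3", attn_implementation="sdpa")

print(needs_bettertransformer("4.35.2"))  # True
print(needs_bettertransformer("4.37.0"))  # False
```

In practice this means the fix for the linked `app.py` is usually to delete the `to_bettertransformer()` call rather than to downgrade any package.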
Expected behavior
The app runs Whisper inference without raising a ValueError.