Use flash_attn 2.7.4.post1 precompiled wheel (torch 2.6 compatible)
requirements.txt +3 -2
requirements.txt CHANGED
@@ -27,5 +27,6 @@ insightface==0.7.3
 transformers==4.52.0
 huggingface_hub
 ninja
-# flash_attn
-#
+# flash_attn precompiled wheel (torch 2.6 + CUDA 12 + Python 3.10)
+# Reference: https://huggingface.co/spaces/fffiloni/Meigen-MultiTalk/blob/main/requirements.txt
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
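The wheel filename encodes the build it targets: cu12 (CUDA 12), torch2.6, cp310 (CPython 3.10), and cxx11abiFALSE (built with _GLIBCXX_USE_CXX11_ABI=0). If the Space's runtime drifts from any of these, the mismatch usually surfaces at import time rather than at pip install time, so a startup check can catch it early. Below is a minimal sketch of such a check (illustrative, not part of this commit); the USE_FLASH_ATTN flag and the SDPA fallback message are assumptions about how the app might consume the result.

# Startup sanity check: a minimal sketch (not part of this commit) that
# verifies the precompiled flash_attn wheel loads against the installed torch.
import torch

try:
    import flash_attn
    print(
        f"flash_attn {flash_attn.__version__} loaded "
        f"(torch {torch.__version__}, CUDA {torch.version.cuda})"
    )
    USE_FLASH_ATTN = True
except ImportError as exc:
    # A torch/CUDA/Python mismatch with the pinned wheel typically shows up
    # here as an undefined-symbol ImportError, not as an install failure.
    print(f"flash_attn unavailable, falling back to SDPA: {exc}")
    USE_FLASH_ATTN = False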