Spaces: Running on Zero
Remove flash_attn (no precompiled wheel for torch 2.6)
requirements.txt +2 -2
CHANGED
@@ -27,5 +27,5 @@ insightface==0.7.3
 transformers==4.52.0
 huggingface_hub
 ninja
-#
-
+# flash_attn removed for now: there is no precompiled wheel for torch 2.6
+# the model will use the default attention implementation
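The fallback described in the new comments can also be made explicit at load time. A minimal sketch, assuming a transformers model is being loaded — the helper name is hypothetical, while `"flash_attention_2"` and `"sdpa"` are values accepted by transformers' `attn_implementation` argument:

```python
import importlib.util

def pick_attn_implementation() -> str:
    """Choose a value for transformers' `attn_implementation` argument:
    request flash_attention_2 only when the flash_attn wheel is actually
    installed, otherwise fall back to PyTorch's default SDPA attention."""
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "sdpa"

# Usage sketch (model_id is a placeholder):
#   AutoModel.from_pretrained(model_id,
#                             attn_implementation=pick_attn_implementation())
```

With flash_attn dropped from requirements.txt, `find_spec` returns None on the Space and the model loads with the default SDPA backend, which needs no precompiled extension.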