Upload README.md with huggingface_hub
README.md CHANGED
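The commit title indicates the card was pushed with the huggingface_hub client. A minimal sketch of how such an upload is typically done is below; the local file path and login step are assumptions for illustration, not details taken from this commit:

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes you are already authenticated (e.g. via `huggingface-cli login`)

# Push the locally edited model card to the repo referenced throughout the table below.
api.upload_file(
    path_or_fileobj="README.md",   # local path (assumption)
    path_in_repo="README.md",
    repo_id="legraphista/Llama-3.1-Storm-8B-IMat-GGUF",
    commit_message="Upload README.md with huggingface_hub",
)
```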
@@ -75,7 +75,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blo
 | [Llama-3.1-Storm-8B.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3.1-Storm-8B.Q4_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | [Llama-3.1-Storm-8B.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No |
-| Llama-3.1-Storm-8B.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3.1-Storm-8B.Q2_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q2_K.gguf) | Q2_K | 3.18GB | ✅ Available | 🟢 IMatrix | 📦 No |
 
 
 ### All Quants
@@ -98,7 +98,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blo
 | Llama-3.1-Storm-8B.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | - |
-| Llama-3.1-Storm-8B.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3.1-Storm-8B.Q2_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q2_K.gguf) | Q2_K | 3.18GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | Llama-3.1-Storm-8B.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | - |
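For reference, a short sketch of pulling the newly listed Q2_K file with huggingface_hub; only the repo id and filename come from the table above, everything else (cache location, environment) is whatever your setup provides:

```python
from huggingface_hub import hf_hub_download

# Fetch the Q2_K quant added in this commit; returns the local path of the cached file.
gguf_path = hf_hub_download(
    repo_id="legraphista/Llama-3.1-Storm-8B-IMat-GGUF",
    filename="Llama-3.1-Storm-8B.Q2_K.gguf",
)
print(gguf_path)
```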