Upload README.md with huggingface_hub
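The commit title says the file was pushed with huggingface_hub; a minimal sketch of what such an upload typically looks like, assuming the standard `HfApi.upload_file` call (the local path is a placeholder, not taken from this commit):

```python
from huggingface_hub import HfApi

api = HfApi()  # picks up HF_TOKEN from the environment or a cached login

# Push a local README to the repo; repo_id and commit message come from this
# commit, the local file path is hypothetical.
api.upload_file(
    path_or_fileobj="README.md",
    path_in_repo="README.md",
    repo_id="legraphista/Llama-3.1-Storm-8B-IMat-GGUF",
    repo_type="model",
    commit_message="Upload README.md with huggingface_hub",
)
```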
README.md CHANGED
@@ -74,7 +74,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blo
 | [Llama-3.1-Storm-8B.Q8_0.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3.1-Storm-8B.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3.1-Storm-8B.Q4_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No |
-| Llama-3.1-Storm-8B.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3.1-Storm-8B.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | Llama-3.1-Storm-8B.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |


@@ -82,7 +82,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blo
 | Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | [Llama-3.1-Storm-8B.BF16.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.BF16.gguf) | BF16 | 16.07GB | ✅ Available | ⚪ Static | 📦 No |
-| Llama-3.1-Storm-8B.FP16 | F16 | - | ⏳ Processing | ⚪ Static | - |
+| [Llama-3.1-Storm-8B.FP16.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.FP16.gguf) | F16 | 16.07GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3.1-Storm-8B.Q8_0.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3.1-Storm-8B.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3.1-Storm-8B.Q5_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q5_K.gguf) | Q5_K | 5.73GB | ✅ Available | ⚪ Static | 📦 No |
@@ -91,7 +91,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blo
 | Llama-3.1-Storm-8B.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | - |
-| Llama-3.1-Storm-8B.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3.1-Storm-8B.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3.1-Storm-8B-IMat-GGUF/blob/main/Llama-3.1-Storm-8B.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | Llama-3.1-Storm-8B.Q3_K_L | Q3_K_L | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.Q3_K_S | Q3_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3.1-Storm-8B.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | - |
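The quants marked ✅ Available in the tables above can be fetched directly; a minimal sketch using huggingface_hub's `hf_hub_download`, shown here with the newly uploaded Q3_K file (any listed filename works the same way):

```python
from huggingface_hub import hf_hub_download

# Download one of the GGUF quants listed above; the file is cached locally
# and the returned path points at the cached copy.
gguf_path = hf_hub_download(
    repo_id="legraphista/Llama-3.1-Storm-8B-IMat-GGUF",
    filename="Llama-3.1-Storm-8B.Q3_K.gguf",
)
print(gguf_path)
```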