Upload folder using huggingface_hub
- .gitattributes +2 -0
- README.md +495 -252
- SYSTEM_PROMPT.txt +29 -0
- chat_template.jinja +121 -0
- config.json +87 -0
- generation_config.json +7 -0
- model-00001-of-00002.safetensors +3 -0
- model-00002-of-00002.safetensors +3 -0
- model.safetensors.index.json +0 -0
- params.json +51 -0
- preprocessor_config.json +34 -0
- processor_config.json +42 -0
- quantization_config.json +0 -0
- tekken.json +3 -0
- tokenizer.json +3 -0
- tokenizer_config.json +1010 -0
.gitattributes
CHANGED
|
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
|
|
| 33 |
*.zip filter=lfs diff=lfs merge=lfs -text
|
| 34 |
*.zst filter=lfs diff=lfs merge=lfs -text
|
| 35 |
*tfevents* filter=lfs diff=lfs merge=lfs -text
|
| 36 |
+
tekken.json filter=lfs diff=lfs merge=lfs -text
|
| 37 |
+
tokenizer.json filter=lfs diff=lfs merge=lfs -text
|
README.md
CHANGED
|
@@ -1,285 +1,528 @@
|
|
| 1 |
---
| 2 |
license: apache-2.0
|
| 3 |
-
|
| 4 |
-
|
| 5 |
-
|
| 6 |
tags:
|
| 7 |
-
-
|
| 8 |
---
|
| 9 |
|
| 55 |
-
<div>3.00 bpw</div>
|
| 56 |
-
</td>
|
| 57 |
-
</tr>
|
| 58 |
-
<tr>
|
| 59 |
-
<td align="center">
|
| 60 |
-
<a href="https://huggingface.co/tuboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/3.50bpw.svg">
|
| 61 |
-
<img src="3.50bpw.svg" alt="3.50 bpw" width="160">
|
| 62 |
-
</a>
|
| 63 |
-
<div>3.50 bpw</div>
|
| 64 |
-
</td>
|
| 65 |
-
<td align="center">
|
| 66 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/4.00bpw.svg">
|
| 67 |
-
<img src="4.00bpw.svg" alt="4.00 bpw" width="160">
|
| 68 |
-
</a>
|
| 69 |
-
<div>4.00 bpw</div>
|
| 70 |
-
</td>
|
| 71 |
-
<td align="center">
|
| 72 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/4.50bpw.svg">
|
| 73 |
-
<img src="4.50bpw.svg" alt="4.50 bpw" width="160">
|
| 74 |
-
</a>
|
| 75 |
-
<div>4.50 bpw</div>
|
| 76 |
-
</td>
|
| 77 |
-
<td align="center">
|
| 78 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/5.00bpw.svg">
|
| 79 |
-
<img src="5.00bpw.svg" alt="5.00 bpw" width="160">
|
| 80 |
-
</a>
|
| 81 |
-
<div>5.00 bpw</div>
|
| 82 |
-
</td>
|
| 83 |
-
</tr>
|
| 84 |
-
<tr>
|
| 85 |
-
<td align="center">
|
| 86 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/6.00bpw.svg">
|
| 87 |
-
<img src="6.00bpw.svg" alt="6.00 bpw" width="160">
|
| 88 |
-
</a>
|
| 89 |
-
<div>6.00 bpw</div>
|
| 90 |
-
</td>
|
| 91 |
-
<td align="center">
|
| 92 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/7.00bpw.svg">
|
| 93 |
-
<img src="7.00bpw.svg" alt="7.00 bpw" width="160">
|
| 94 |
-
</a>
|
| 95 |
-
<div>7.00 bpw</div>
|
| 96 |
-
</td>
|
| 97 |
-
<td align="center">
|
| 98 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/8.00bpw.svg">
|
| 99 |
-
<img src="8.00bpw.svg" alt="8.00 bpw" width="160">
|
| 100 |
-
</a>
|
| 101 |
-
<div>8.00 bpw</div>
|
| 102 |
-
</td>
|
| 103 |
-
<td align="center">
|
| 104 |
-
<a href="https://huggingface.co/turboderp/Ministral-3-14B-Instruct-2512-exl3/blob/main/hf.svg">
|
| 105 |
-
<img src="hf.svg" alt="Transformers" width="160">
|
| 106 |
-
</a>
|
| 107 |
-
<div>Transformers</div>
|
| 108 |
-
</td>
|
| 109 |
-
</tr>
|
| 110 |
-
</table>
|
| 111 |
-
# Image captioning
|
| 112 |
-
|
| 113 |
-

|
| 114 |
|
| 115 |
-
|
| 116 |
-
<summary><b>2.00bpw</b></summary>
|
| 117 |
-
This image depicts six cats perched atop and inside a vintage-style suitcase.
|
| 118 |
|
| 119 |
-
|
| 120 |
|
| 121 |
-
|
| 122 |
-
|
| 123 |
-
|
| 124 |
-
|
| 125 |
-
|
| 126 |
-
|
| 127 |
|
| 128 |
-
All the cats have varied expressions, some appearing curious or contemplative as they peer upward. The overall composition of the image gives a whimsical, playful vibe as the cats seem to be ready to embark on an adventure.
|
| 129 |
-
</details>
|
| 130 |
-
<details>
|
| 131 |
-
<summary><b>2.25bpw</b></summary>
|
| 132 |
-
The image shows six adorable kittens posed inside and around a vintage-style suitcase. Here’s a detailed description:
|
| 133 |
-
|
| 134 |
-
1. **Suitcase:**
|
| 135 |
-
- The suitcase appears to be made of metal and has a classic, old-school design with a light brown leather strap and brass handles and fittings.
|
| 136 |
-
- The interior is visible with the kittens sitting inside and one of them hanging its head out from the top opening.
|
| 137 |
-
|
| 138 |
-
2. **Kittens:**
|
| 139 |
-
- The kittens have a variety of markings and colors.
|
| 140 |
-
- Four of the kittens are gray with white markings on their faces and paws.
|
| 141 |
-
- The fifth kitten is predominantly gray with a white face and a more distinct white chest area.
|
| 142 |
-
- The last kitten on the right is mostly black with a white patch on its face.
|
| 143 |
-
- The kittens are well-groomed, clean, and appear to be young and lively.
|
| 144 |
-
|
| 145 |
-
3. **Poses:**
|
| 146 |
-
- Three kittens are positioned inside the suitcase, looking out and sitting on the surface.
|
| 147 |
-
- The middle kitten is sticking its head out from the suitcase while its body is inside.
|
| 148 |
-
- One kitten is sitting outside the suitcase, leaning against the edge.
|
| 149 |
-
- The others are positioned both on top and beside the suitcase.
|
| 150 |
-
|
| 151 |
-
The image is arranged in a playful and aesthetically pleasing manner, likely intended to evoke a sense of adventure and curiosity about where these kittens are going. It’s a charming composition, capturing the innocent and curious nature of kittens.
|
| 152 |
-
</details>
|
| 153 |
-
<details>
|
| 154 |
-
<summary><b>2.50bpw</b></summary>
|
| 155 |
-
This image showcases six cats situated in and around an open vintage-style suitcase. The scene appears to be staged for a creative and humorous purpose.
|
| 156 |
-
|
| 157 |
-
From left to right:
|
| 158 |
-
1. A gray cat with a long tail and fluffy appearance is perched on the edge of the suitcase.
|
| 159 |
-
2. Another gray cat, also fluffy, is sitting just below the first cat, leaning out of the suitcase.
|
| 160 |
-
3. A gray cat with a white patch on its chest and a slightly raised posture is sitting inside the suitcase.
|
| 161 |
-
4. Another cat that looks similar to the third one but has a bit more white on its chest and face is also sitting inside the suitcase.
|
| 162 |
-
5. A gray cat with white markings on its face and chest, featuring a distinctive tufted tail, is positioned inside the suitcase with one paw resting on the edge and looking towards the camera.
|
| 163 |
-
6. A black cat with white paws is sitting inside the suitcase and gazing directly forward.
|
| 164 |
-
|
| 165 |
-
The vintage suitcase has a sleek metallic frame and is adorned with brass studs and wooden accents. It is open, and the cats are positioned in a way that suggests they have just come out from inside it. The image creates a playful juxtaposition between the cats and the suitcase, perhaps evoking a sense of travel or adventure.
|
| 166 |
-
</details>
|
| 167 |
<details>
|
| 168 |
-
<summary
|
| 169 |
-
|
| 170 |
-
|
| 171 |
-
|
| 172 |
-
|
| 173 |
-
|
| 174 |
-
|
| 175 |
-
|
| 176 |
-
|
| 177 |
-
|
| 178 |
-
|
| 179 |
-
|
| 180 |
-
|
| 181 |
-
|
| 182 |
-
|
| 183 |
-
|
| 184 |
-
|
|
|
| 185 |
</details>
|
|
|
|
| 186 |
<details>
|
| 187 |
-
<summary
|
| 203 |
</details>
|
|
|
|
| 204 |
<details>
|
| 205 |
-
<summary
|
| 206 |
-
The image shows six cats sitting and standing on and around a vintage-style suitcase. The cats seem to be of similar breeds and colors, likely siblings or closely related, as they share many physical traits. Here is a detailed breakdown of the cats and their positions:
|
| 207 |
|
| 208 |
-
|
| 209 |
-
2. The second cat from the left is perched on the edge of the suitcase with its front paws outside and its body inside the suitcase. It is grey with a white face, paws, and chest.
|
| 210 |
-
3. The third cat from the left is sitting comfortably inside the suitcase. This cat is similar in appearance to the leftmost cat with a grey longhaired coat.
|
| 211 |
-
4. The fourth cat from the left is also sitting inside the suitcase, just beside the third cat. It appears to have a greyish coat with a slightly darker hue compared to the others.
|
| 212 |
-
5. The fifth cat from the left is grey with a white face, paws, and chest, much like the second cat. It is sitting inside the suitcase with its body partially visible.
|
| 213 |
-
6. The rightmost cat is standing next to the suitcase with its full body outside. This cat is solid black in color, contrasting with the other grey cats.
|
| 214 |
|
| 215 |
-
|
| 216 |
-
|
| 217 |
-
|
| 218 |
-
<summary><b>4.50bpw</b></summary>
|
| 219 |
-
This image features six cats sitting and standing atop a vintage-style suitcase. The suitcase appears to be an old-fashioned metal case with brass fittings and brown leather accents. Five of the cats have grey and white fur with long hair, and they are perched at various levels on the suitcase, creating a charming, whimsical composition. One grey-and-white cat is partially inside the open case, another sits on the lid, and three others are positioned on the edges and corners, looking alert and curious.
|
| 220 |
|
| 221 |
-
|
| 222 |
|
| 223 |
-
|
| 224 |
-
|
| 225 |
-
<details>
|
| 226 |
-
<summary><b>5.00bpw</b></summary>
|
| 227 |
-
The image depicts six adorable kittens sitting and climbing on top of and inside a vintage-style suitcase. Here are the details:
|
| 228 |
|
| 229 |
-
|
| 230 |
-
|
| 231 |
-
|
| 232 |
-
|
| 233 |
-
3. A grey tabby kitten with a distinctive white "M"-shaped marking on its forehead.
|
| 234 |
-
4. A grey kitten with a long, straight body and a short tail.
|
| 235 |
-
5. A darker grey kitten with bright yellow eyes, also with a distinctive white "M" shape on its forehead.
|
| 236 |
|
| 237 |
-
|
|
|
|
| 238 |
|
| 239 |
-
The kittens are all looking towards the camera, displaying various shades of grey fur and a mix of tabby patterns and solid colors. Their bright and expressive eyes contribute to the charming and endearing quality of the image.
|
| 240 |
-
</details>
|
| 241 |
-
<details>
|
| 242 |
-
<summary><b>6.00bpw</b></summary>
|
| 243 |
-
This image depicts six cats in a playful and arranged scene atop a vintage-style suitcase and a closed briefcase.
|
| 244 |
|
| 245 |
-
|
| 246 |
-
|
| 247 |
-
|
| 248 |
-
|
| 249 |
-
|
|
|
|
| 250 |
|
| 251 |
-
|
| 252 |
-
- The cats are arranged on two pieces of luggage. One is a vintage-style open suitcase, and the other is a closed briefcase with a similar design.
|
| 253 |
-
- The open suitcase has metal buckles and a handle, along with a fabric or leather interior.
|
| 254 |
-
- The closed briefcase, which appears to be placed on top of the suitcase, also has metal buckles and a handle, with a dark exterior.
|
| 255 |
|
| 256 |
-
|
| 257 |
-
|
| 258 |
|
| 259 |
-
|
| 260 |
-
|
| 261 |
-
|
| 262 |
|
| 263 |
-
This scene suggests a whimsical and light-hearted theme, often associated with cats and their penchant for finding cozy spots to relax. The vintage style of the suitcases adds a touch of charm to the image.
|
| 264 |
</details>
|
| 265 |
<details>
|
| 266 |
-
<summary
|
| 267 |
-
|
| 268 |
|
| 269 |
-
The kittens are a mix of colors: three are gray, two have a mix of black and gray with white markings, and one is predominantly black with some white on its chest and legs. Their eyes are wide and golden, adding an endearing touch to the scene. The background is a plain, light-colored surface, which helps keep the focus on the kittens and the suitcase. The overall atmosphere of the image is charming and playful, capturing the curious nature of kittens.
|
| 270 |
</details>
|
| 271 |
-
<details>
|
| 272 |
-
<summary><b>8.00bpw</b></summary>
|
| 273 |
-
This image showcases a group of six kittens positioned atop and around two vintage-style suitcases.
|
| 274 |
|
| 275 |
-
|
| 276 |
|
| 277 |
-
|
| 278 |
-
- The first two kittens on the left are grey with longer fur.
|
| 279 |
-
- The next two kittens are also grey but have shorter fur and a more solid appearance.
|
| 280 |
-
- The kitten in the middle has a grey body with white paws and chest, and short fur.
|
| 281 |
-
- The fifth kitten is grey with a black and white face.
|
| 282 |
-
- The kitten on the far right is solid black.
|
| 283 |
|
| 284 |
-
|
| 285 |
-
</details>
|
|
|
|
| 1 |
---
|
| 2 |
+
library_name: vllm
|
| 3 |
+
language:
|
| 4 |
+
- en
|
| 5 |
+
- fr
|
| 6 |
+
- es
|
| 7 |
+
- de
|
| 8 |
+
- it
|
| 9 |
+
- pt
|
| 10 |
+
- nl
|
| 11 |
+
- zh
|
| 12 |
+
- ja
|
| 13 |
+
- ko
|
| 14 |
+
- ar
|
| 15 |
license: apache-2.0
|
| 16 |
+
inference: false
|
| 17 |
+
base_model:
|
| 18 |
+
- mistralai/Ministral-3-14B-Base-2512
|
| 19 |
+
extra_gated_description: >-
|
| 20 |
+
If you want to learn more about how we process your personal data, please read
|
| 21 |
+
our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
|
| 22 |
tags:
|
| 23 |
+
- mistral-common
|
| 24 |
---
|
| 25 |
|
| 26 |
+
# Ministral 3 14B Instruct 2512
|
| 27 |
+
The largest model in the Ministral 3 family, **Ministral 3 14B** offers frontier capabilities and performance comparable to its larger [Mistral Small 3.2 24B](https://huggingface.co/mistralai/Mistral-Small-3.2-Instruct-2506) counterpart. It is a powerful and efficient language model with vision capabilities.
|
| 28 |
+
|
| 29 |
+
This model is the instruct post-trained version in **FP8**, fine-tuned for instruction tasks, making it ideal for chat and instruction-based use cases.
|
| 30 |
+
|
| 31 |
+
The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 14B can even be deployed locally, fitting in 24GB of VRAM in FP8, and in less if further quantized.
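As a rough sanity check of the 24GB figure, the sketch below estimates the footprint from the parameter counts listed under Key Features and the attention shape in this repository's `config.json`; the one-byte-per-FP8-parameter and BF16-KV-cache assumptions are ours, not an official sizing guide.

```python
# Illustrative VRAM estimate for Ministral 3 14B in FP8 (assumptions, not official numbers).
n_params = 13.5e9 + 0.4e9                  # language model + vision encoder parameters
weights_gb = n_params * 1 / 1e9            # FP8 stores roughly 1 byte per parameter
kv_bytes_per_token = 40 * 8 * 128 * 2 * 2  # layers * kv_heads * head_dim * (K and V) * 2 bytes (BF16 cache)
kv_gb_32k = kv_bytes_per_token * 32_768 / 1e9
print(f"weights: ~{weights_gb:.1f} GB, KV cache at 32k tokens: ~{kv_gb_32k:.1f} GB")
# ~13.9 GB of weights plus ~5.4 GB of cache fits comfortably within a 24 GB budget.
```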
|
| 32 |
+
|
| 33 |
+
Learn more in our blog post [here](https://mistral.ai/news/mistral-3).
|
| 34 |
+
|
| 35 |
+
## Key Features
|
| 36 |
+
Ministral 3 14B consists of two main architectural components:
|
| 37 |
+
- **13.5B Language Model**
|
| 38 |
+
- **0.4B Vision Encoder**
|
| 39 |
+
|
| 40 |
+
The Ministral 3 14B Instruct model offers the following capabilities:
|
| 41 |
+
- **Vision**: Enables the model to analyze images and provide insights based on visual content, in addition to text.
|
| 42 |
+
- **Multilingual**: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, and Arabic.
|
| 43 |
+
- **System Prompt**: Maintains strong adherence and support for system prompts.
|
| 44 |
+
- **Agentic**: Offers best-in-class agentic capabilities with native function calling and JSON output.
|
| 45 |
+
- **Edge-Optimized**: Delivers best-in-class performance at a small scale, deployable anywhere.
|
| 46 |
+
- **Apache 2.0 License**: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
|
| 47 |
+
- **Large Context Window**: Supports a 256k context window.
|
| 48 |
+
|
| 49 |
+
### Use Cases
|
| 50 |
+
Private AI deployments where advanced capabilities meet practical hardware constraints:
|
| 51 |
+
- Private/custom chat and AI assistant deployments in constrained environments
|
| 52 |
+
- Advanced local agentic use cases
|
| 53 |
+
- Fine-tuning and specialization
|
| 54 |
+
- And more...
|
| 55 |
+
|
| 56 |
+
Bringing advanced AI capabilities to most environments.
|
| 57 |
+
|
| 58 |
+
## Ministral 3 Family
|
| 59 |
+
|
| 60 |
+
| Model Name | Type | Precision | Link |
|
| 61 |
+
|--------------------------------|--------------------|-----------|------------------------------------------------------------------------------------------|
|
| 62 |
+
| Ministral 3 3B Base 2512 | Base pre-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Base-2512) |
|
| 63 |
+
| Ministral 3 3B Instruct 2512 | Instruct post-trained | FP8 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512) |
|
| 64 |
+
| Ministral 3 3B Reasoning 2512 | Reasoning capable | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Reasoning-2512) |
|
| 65 |
+
| Ministral 3 8B Base 2512 | Base pre-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Base-2512) |
|
| 66 |
+
| Ministral 3 8B Instruct 2512 | Instruct post-trained | FP8 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Instruct-2512) |
|
| 67 |
+
| Ministral 3 8B Reasoning 2512 | Reasoning capable | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Reasoning-2512) |
|
| 68 |
+
| Ministral 3 14B Base 2512 | Base pre-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-14B-Base-2512) |
|
| 69 |
+
| **Ministral 3 14B Instruct 2512** | **Instruct post-trained** | **FP8** | [**Hugging Face**](https://huggingface.co/mistralai/Ministral-3-14B-Instruct-2512) |
|
| 70 |
+
| Ministral 3 14B Reasoning 2512 | Reasoning capable | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-14B-Reasoning-2512) |
|
|
| 71 |
|
| 72 |
+
Other formats available [here](https://huggingface.co/collections/mistralai/ministral-3-additional-checkpoints).
|
| 73 |
|
| 74 |
+
## Benchmark Results
|
| 75 |
|
| 76 |
+
We compare Ministral 3 to similarly sized models.
|
| 77 |
+
|
| 78 |
+
### Reasoning
|
| 79 |
+
|
| 80 |
+
| Model | AIME25 | AIME24 | GPQA Diamond | LiveCodeBench |
|
| 81 |
+
|---------------------------|-------------|-------------|--------------|---------------|
|
| 82 |
+
| **Ministral 3 14B** | <u>0.850</u>| <u>0.898</u>| <u>0.712</u> | <u>0.646</u> |
|
| 83 |
+
| Qwen3-14B (Thinking) | 0.737 | 0.837 | 0.663 | 0.593 |
|
| 84 |
+
| | | | | |
|
| 85 |
+
| **Ministral 3 8B** | 0.787 | <u>0.860</u>| 0.668 | <u>0.616</u> |
|
| 86 |
+
| Qwen3-VL-8B-Thinking | <u>0.798</u>| <u>0.860</u>| <u>0.671</u> | 0.580 |
|
| 87 |
+
| | | | | |
|
| 88 |
+
| **Ministral 3 3B** | <u>0.721</u>| <u>0.775</u>| 0.534 | <u>0.548</u> |
|
| 89 |
+
| Qwen3-VL-4B-Thinking | 0.697 | 0.729 | <u>0.601</u> | 0.513 |
|
| 90 |
+
|
| 91 |
+
### Instruct
|
| 92 |
+
|
| 93 |
+
| Model | Arena Hard | WildBench | MATH Maj@1 | MM MTBench |
|
| 94 |
+
|---------------------------|-------------|------------|-------------|------------------|
|
| 95 |
+
| **Ministral 3 14B** | <u>0.551</u>| <u>68.5</u>| <u>0.904</u>| <u>8.49</u> |
|
| 96 |
+
| Qwen3 14B (Non-Thinking) | 0.427 | 65.1 | 0.870 | NOT MULTIMODAL |
|
| 97 |
+
| Gemma3-12B-Instruct | 0.436 | 63.2 | 0.854 | 6.70 |
|
| 98 |
+
| | | | | |
|
| 99 |
+
| **Ministral 3 8B** | 0.509 | <u>66.8</u>| 0.876 | <u>8.08</u> |
|
| 100 |
+
| Qwen3-VL-8B-Instruct | <u>0.528</u>| 66.3 | <u>0.946</u>| 8.00 |
|
| 101 |
+
| | | | | |
|
| 102 |
+
| **Ministral 3 3B** | 0.305 | <u>56.8</u>| 0.830 | 7.83 |
|
| 103 |
+
| Qwen3-VL-4B-Instruct | <u>0.438</u>| <u>56.8</u>| <u>0.900</u>| <u>8.01</u> |
|
| 104 |
+
| Qwen3-VL-2B-Instruct | 0.163 | 42.2 | 0.786 | 6.36 |
|
| 105 |
+
| Gemma3-4B-Instruct | 0.318 | 49.1 | 0.759 | 5.23 |
|
| 106 |
+
|
| 107 |
+
### Base
|
| 108 |
+
|
| 109 |
+
| Model | Multilingual MMLU | MATH CoT 2-Shot | AGIEval 5-shot | MMLU Redux 5-shot | MMLU 5-shot | TriviaQA 5-shot |
|
| 110 |
+
|---------------------|-------------------|-----------------|----------------|-------------------|-------------|-----------------|
|
| 111 |
+
| **Ministral 3 14B** | 0.742 | <u>0.676</u> | 0.648 | 0.820 | 0.794 | 0.749 |
|
| 112 |
+
| Qwen3 14B Base | <u>0.754</u> | 0.620 | <u>0.661</u> | <u>0.837</u> | <u>0.804</u>| 0.703 |
|
| 113 |
+
| Gemma 3 12B Base | 0.690 | 0.487 | 0.587 | 0.766 | 0.745 | <u>0.788</u> |
|
| 114 |
+
| | | | | | | |
|
| 115 |
+
| **Ministral 3 8B** | <u>0.706</u> | <u>0.626</u> | 0.591 | 0.793 | <u>0.761</u>| <u>0.681</u> |
|
| 116 |
+
| Qwen 3 8B Base | 0.700 | 0.576 | <u>0.596</u> | <u>0.794</u> | 0.760 | 0.639 |
|
| 117 |
+
| | | | | | | |
|
| 118 |
+
| **Ministral 3 3B** | 0.652 | <u>0.601</u> | 0.511 | 0.735 | 0.707 | 0.592 |
|
| 119 |
+
| Qwen 3 4B Base | <u>0.677</u> | 0.405 | <u>0.570</u> | <u>0.759</u> | <u>0.713</u>| 0.530 |
|
| 120 |
+
| Gemma 3 4B Base | 0.516 | 0.294 | 0.430 | 0.626 | 0.589 | <u>0.640</u> |
|
| 121 |
+
|
| 122 |
+
## Usage
|
| 123 |
+
|
| 124 |
+
The model can be used with the following frameworks:
|
| 125 |
+
- [`vllm`](https://github.com/vllm-project/vllm): See [here](#vllm)
|
| 126 |
+
- [`transformers`](https://github.com/huggingface/transformers): See [here](#transformers)
|
| 127 |
+
|
| 128 |
+
### vLLM
|
| 129 |
+
|
| 130 |
+
We recommend using this model with [vLLM](https://github.com/vllm-project/vllm).
|
| 131 |
+
|
| 132 |
+
#### Installation
|
| 133 |
+
|
| 134 |
+
Make sure to install the most recent version of vLLM:
|
| 135 |
+
|
| 136 |
+
```sh
|
| 137 |
+
uv pip install -U vllm \
|
| 138 |
+
--torch-backend=auto \
|
| 139 |
+
--extra-index-url https://wheels.vllm.ai/nightly
|
| 140 |
+
```
|
| 141 |
+
|
| 142 |
+
Doing so should automatically install [`mistral_common >= 1.8.6`](https://github.com/mistralai/mistral-common/releases/tag/v1.8.6).
|
| 143 |
+
|
| 144 |
+
To check:
|
| 145 |
+
```sh
|
| 146 |
+
python -c "import mistral_common; print(mistral_common.__version__)"
|
| 147 |
+
```
|
| 148 |
+
|
| 149 |
+
You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or one from [Docker Hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39).
|
| 150 |
+
|
| 151 |
+
#### Serve
|
| 152 |
+
|
| 153 |
+
Due to their size and the FP8 format of their weights, `Ministral-3-3B-Instruct-2512`, `Ministral-3-8B-Instruct-2512` and `Ministral-3-14B-Instruct-2512` can each run on a single H200 GPU.
|
| 154 |
+
|
| 155 |
+
A simple launch command is:
|
| 156 |
+
|
| 157 |
+
```bash
|
| 158 |
+
vllm serve mistralai/Ministral-3-14B-Instruct-2512 \
|
| 159 |
+
--tokenizer_mode mistral --config_format mistral --load_format mistral \
|
| 160 |
+
--enable-auto-tool-choice --tool-call-parser mistral
|
| 161 |
+
```
|
| 162 |
+
|
| 163 |
+
Key parameter notes:
|
| 164 |
+
|
| 165 |
+
* `--enable-auto-tool-choice`: required when enabling tool usage.
|
| 166 |
+
* `--tool-call-parser mistral`: required when enabling tool usage.
|
| 167 |
+
|
| 168 |
+
|
| 169 |
+
Additional flags:
|
| 170 |
+
|
| 171 |
+
* You can set `--max-model-len` to preserve memory. By default it is set to `262144`, which is quite large and not necessary for most scenarios.
|
| 172 |
+
* You can set `--max-num-batched-tokens` to balance throughput and latency; higher values mean higher throughput but also higher latency. See the example combining both flags below.
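As a minimal sketch, both flags can simply be appended to the launch command from the Serve section above; the `32768` and `8192` values are purely illustrative, not recommendations.

```bash
vllm serve mistralai/Ministral-3-14B-Instruct-2512 \
    --tokenizer_mode mistral --config_format mistral --load_format mistral \
    --enable-auto-tool-choice --tool-call-parser mistral \
    --max-model-len 32768 \
    --max-num-batched-tokens 8192
```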
|
| 173 |
+
|
| 174 |
+
#### Usage of the model
|
| 175 |
+
|
| 176 |
+
Here we assume that the model `mistralai/Ministral-3-14B-Instruct-2512` is being served and reachable at `localhost` on port `8000`, the default for vLLM.
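Before running the longer snippets below, you can quickly confirm the server is reachable by listing the models it exposes; this mirrors the client setup used in every example (the `EMPTY` API key is vLLM's placeholder).

```python
from openai import OpenAI

# Point the OpenAI-compatible client at the local vLLM server started above.
client = OpenAI(api_key="EMPTY", base_url="http://localhost:8000/v1")

# Should print "mistralai/Ministral-3-14B-Instruct-2512" if the server is up.
for m in client.models.list().data:
    print(m.id)
```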
|
| 177 |
|
| 178 |
<details>
|
| 179 |
+
<summary>Vision Reasoning</summary>
|
| 180 |
+
|
| 181 |
+
Let's see if Ministral 3 knows when to pick a fight!
|
| 182 |
+
|
| 183 |
+
```python
|
| 184 |
+
from datetime import datetime, timedelta
|
| 185 |
+
|
| 186 |
+
from openai import OpenAI
|
| 187 |
+
from huggingface_hub import hf_hub_download
|
| 188 |
+
|
| 189 |
+
# Modify OpenAI's API key and API base to use vLLM's API server.
|
| 190 |
+
openai_api_key = "EMPTY"
|
| 191 |
+
openai_api_base = "http://localhost:8000/v1"
|
| 192 |
+
|
| 193 |
+
TEMP = 0.15
|
| 194 |
+
MAX_TOK = 262144
|
| 195 |
+
|
| 196 |
+
client = OpenAI(
|
| 197 |
+
api_key=openai_api_key,
|
| 198 |
+
base_url=openai_api_base,
|
| 199 |
+
)
|
| 200 |
+
|
| 201 |
+
models = client.models.list()
|
| 202 |
+
model = models.data[0].id
|
| 203 |
+
|
| 204 |
+
|
| 205 |
+
def load_system_prompt(repo_id: str, filename: str) -> str:
|
| 206 |
+
file_path = hf_hub_download(repo_id=repo_id, filename=filename)
|
| 207 |
+
with open(file_path, "r") as file:
|
| 208 |
+
system_prompt = file.read()
|
| 209 |
+
today = datetime.today().strftime("%Y-%m-%d")
|
| 210 |
+
yesterday = (datetime.today() - timedelta(days=1)).strftime("%Y-%m-%d")
|
| 211 |
+
model_name = repo_id.split("/")[-1]
|
| 212 |
+
return system_prompt.format(name=model_name, today=today, yesterday=yesterday)
|
| 213 |
+
|
| 214 |
+
|
| 215 |
+
SYSTEM_PROMPT = load_system_prompt(model, "SYSTEM_PROMPT.txt")
|
| 216 |
+
image_url = "https://static.wikia.nocookie.net/essentialsdocs/images/7/70/Battle.png/revision/latest?cb=20220523172438"
|
| 217 |
+
|
| 218 |
+
messages = [
|
| 219 |
+
{"role": "system", "content": SYSTEM_PROMPT},
|
| 220 |
+
{
|
| 221 |
+
"role": "user",
|
| 222 |
+
"content": [
|
| 223 |
+
{
|
| 224 |
+
"type": "text",
|
| 225 |
+
"text": "What action do you think I should take in this situation? List all the possible actions and explain why you think they are good or bad.",
|
| 226 |
+
},
|
| 227 |
+
{"type": "image_url", "image_url": {"url": image_url}},
|
| 228 |
+
],
|
| 229 |
+
},
|
| 230 |
+
]
|
| 231 |
+
|
| 232 |
+
|
| 233 |
+
response = client.chat.completions.create(
|
| 234 |
+
model=model,
|
| 235 |
+
messages=messages,
|
| 236 |
+
temperature=TEMP,
|
| 237 |
+
max_tokens=MAX_TOK,
|
| 238 |
+
)
|
| 239 |
+
|
| 240 |
+
print(response.choices[0].message.content)
|
| 241 |
+
```
|
| 242 |
+
|
| 243 |
</details>
|
| 244 |
+
|
| 245 |
<details>
|
| 246 |
+
<summary>Function Calling</summary>
|
| 247 |
+
|
| 248 |
+
Let's solve some equations thanks to our simple Python calculator tool.
|
| 249 |
+
|
| 250 |
+
```python
|
| 251 |
+
import json
|
| 252 |
+
from openai import OpenAI
|
| 253 |
+
from huggingface_hub import hf_hub_download
|
| 254 |
+
|
| 255 |
+
# Modify OpenAI's API key and API base to use vLLM's API server.
|
| 256 |
+
openai_api_key = "EMPTY"
|
| 257 |
+
openai_api_base = "http://localhost:8000/v1"
|
| 258 |
+
|
| 259 |
+
TEMP = 0.15
|
| 260 |
+
MAX_TOK = 262144
|
| 261 |
+
|
| 262 |
+
client = OpenAI(
|
| 263 |
+
api_key=openai_api_key,
|
| 264 |
+
base_url=openai_api_base,
|
| 265 |
+
)
|
| 266 |
+
|
| 267 |
+
models = client.models.list()
|
| 268 |
+
model = models.data[0].id
|
| 269 |
+
|
| 270 |
+
|
| 271 |
+
def load_system_prompt(repo_id: str, filename: str) -> str:
|
| 272 |
+
file_path = hf_hub_download(repo_id=repo_id, filename=filename)
|
| 273 |
+
with open(file_path, "r") as file:
|
| 274 |
+
system_prompt = file.read()
|
| 275 |
+
return system_prompt
|
| 276 |
+
|
| 277 |
+
|
| 278 |
+
SYSTEM_PROMPT = load_system_prompt(model, "SYSTEM_PROMPT.txt")
|
| 279 |
+
|
| 280 |
+
image_url = "https://math-coaching.com/img/fiche/46/expressions-mathematiques.jpg"
|
| 281 |
+
|
| 282 |
+
|
| 283 |
+
def my_calculator(expression: str) -> str:
|
| 284 |
+
return str(eval(expression))
|
| 285 |
+
|
| 286 |
+
|
| 287 |
+
tools = [
|
| 288 |
+
{
|
| 289 |
+
"type": "function",
|
| 290 |
+
"function": {
|
| 291 |
+
"name": "my_calculator",
|
| 292 |
+
"description": "A calculator that can evaluate a mathematical expression.",
|
| 293 |
+
"parameters": {
|
| 294 |
+
"type": "object",
|
| 295 |
+
"properties": {
|
| 296 |
+
"expression": {
|
| 297 |
+
"type": "string",
|
| 298 |
+
"description": "The mathematical expression to evaluate.",
|
| 299 |
+
},
|
| 300 |
+
},
|
| 301 |
+
"required": ["expression"],
|
| 302 |
+
},
|
| 303 |
+
},
|
| 304 |
+
},
|
| 305 |
+
{
|
| 306 |
+
"type": "function",
|
| 307 |
+
"function": {
|
| 308 |
+
"name": "rewrite",
|
| 309 |
+
"description": "Rewrite a given text for improved clarity",
|
| 310 |
+
"parameters": {
|
| 311 |
+
"type": "object",
|
| 312 |
+
"properties": {
|
| 313 |
+
"text": {
|
| 314 |
+
"type": "string",
|
| 315 |
+
"description": "The input text to rewrite",
|
| 316 |
+
}
|
| 317 |
+
},
|
| 318 |
+
},
|
| 319 |
+
},
|
| 320 |
+
},
|
| 321 |
+
]
|
| 322 |
+
|
| 323 |
+
messages = [
|
| 324 |
+
{"role": "system", "content": SYSTEM_PROMPT},
|
| 325 |
+
{
|
| 326 |
+
"role": "user",
|
| 327 |
+
"content": [
|
| 328 |
+
{
|
| 329 |
+
"type": "text",
|
| 330 |
+
"text": "Thanks to your calculator, compute the results for the equations that involve numbers displayed in the image.",
|
| 331 |
+
},
|
| 332 |
+
{
|
| 333 |
+
"type": "image_url",
|
| 334 |
+
"image_url": {
|
| 335 |
+
"url": image_url,
|
| 336 |
+
},
|
| 337 |
+
},
|
| 338 |
+
],
|
| 339 |
+
},
|
| 340 |
+
]
|
| 341 |
+
|
| 342 |
+
response = client.chat.completions.create(
|
| 343 |
+
model=model,
|
| 344 |
+
messages=messages,
|
| 345 |
+
temperature=TEMP,
|
| 346 |
+
max_tokens=MAX_TOK,
|
| 347 |
+
tools=tools,
|
| 348 |
+
tool_choice="auto",
|
| 349 |
+
)
|
| 350 |
+
|
| 351 |
+
tool_calls = response.choices[0].message.tool_calls
|
| 352 |
+
|
| 353 |
+
results = []
|
| 354 |
+
for tool_call in tool_calls:
|
| 355 |
+
function_name = tool_call.function.name
|
| 356 |
+
function_args = tool_call.function.arguments
|
| 357 |
+
if function_name == "my_calculator":
|
| 358 |
+
result = my_calculator(**json.loads(function_args))
|
| 359 |
+
results.append(result)
|
| 360 |
+
|
| 361 |
+
messages.append({"role": "assistant", "tool_calls": tool_calls})
|
| 362 |
+
for tool_call, result in zip(tool_calls, results):
|
| 363 |
+
messages.append(
|
| 364 |
+
{
|
| 365 |
+
"role": "tool",
|
| 366 |
+
"tool_call_id": tool_call.id,
|
| 367 |
+
"name": tool_call.function.name,
|
| 368 |
+
"content": result,
|
| 369 |
+
}
|
| 370 |
+
)
|
| 371 |
+
|
| 372 |
+
|
| 373 |
+
response = client.chat.completions.create(
|
| 374 |
+
model=model,
|
| 375 |
+
messages=messages,
|
| 376 |
+
temperature=TEMP,
|
| 377 |
+
max_tokens=MAX_TOK,
|
| 378 |
+
)
|
| 379 |
+
|
| 380 |
+
print(response.choices[0].message.content)
|
| 381 |
+
```
|
| 382 |
+
|
| 383 |
</details>
|
| 384 |
+
|
| 385 |
<details>
|
| 386 |
+
<summary>Text-Only Request</summary>
|
|
|
|
| 387 |
|
| 388 |
+
Ministral 3 can follow your instructions to the letter.
|
| 389 |
|
| 390 |
+
```python
|
| 391 |
+
from openai import OpenAI
|
| 392 |
+
from huggingface_hub import hf_hub_download
|
|
| 393 |
|
| 394 |
+
# Modify OpenAI's API key and API base to use vLLM's API server.
|
| 395 |
+
openai_api_key = "EMPTY"
|
| 396 |
+
openai_api_base = "http://localhost:8000/v1"
|
| 397 |
|
| 398 |
+
TEMP = 0.15
|
| 399 |
+
MAX_TOK = 262144
|
| 400 |
|
| 401 |
+
client = OpenAI(
|
| 402 |
+
api_key=openai_api_key,
|
| 403 |
+
base_url=openai_api_base,
|
| 404 |
+
)
|
| 405 |
|
| 406 |
+
models = client.models.list()
|
| 407 |
+
model = models.data[0].id
|
| 408 |
|
| 409 |
|
| 410 |
+
def load_system_prompt(repo_id: str, filename: str) -> str:
|
| 411 |
+
file_path = hf_hub_download(repo_id=repo_id, filename=filename)
|
| 412 |
+
with open(file_path, "r") as file:
|
| 413 |
+
system_prompt = file.read()
|
| 414 |
+
return system_prompt
|
| 415 |
+
|
| 416 |
|
| 417 |
+
SYSTEM_PROMPT = load_system_prompt(model, "SYSTEM_PROMPT.txt")
|
| 418 |
|
| 419 |
+
messages = [
|
| 420 |
+
{"role": "system", "content": SYSTEM_PROMPT},
|
| 421 |
+
{
|
| 422 |
+
"role": "user",
|
| 423 |
+
"content": "Write me a sentence where every word starts with the next letter in the alphabet - start with 'a' and end with 'z'.",
|
| 424 |
+
},
|
| 425 |
+
]
|
| 426 |
|
| 427 |
+
response = client.chat.completions.create(
|
| 428 |
+
model=model,
|
| 429 |
+
messages=messages,
|
| 430 |
+
temperature=TEMP,
|
| 431 |
+
max_tokens=MAX_TOK,
|
| 432 |
+
)
|
| 433 |
+
|
| 434 |
+
assistant_message = response.choices[0].message.content
|
| 435 |
+
print(assistant_message)
|
| 436 |
+
```
|
| 437 |
|
| 438 |
</details>
|
| 439 |
+
|
| 440 |
+
### Transformers
|
| 441 |
+
|
| 442 |
+
You can also use Ministral 3 14B Instruct 2512 with `Transformers`!
|
| 443 |
+
|
| 444 |
+
Transformers very recently added preliminary support for FP8, so please make sure to install from main:
|
| 445 |
+
|
| 446 |
+
```sh
|
| 447 |
+
uv pip install git+https://github.com/huggingface/transformers
|
| 448 |
+
```
|
| 449 |
+
|
| 450 |
+
To make the best use of our model with `Transformers`, make sure to have [installed](https://github.com/mistralai/mistral-common) `mistral-common >= 1.8.6` to use our tokenizer.
|
| 451 |
+
|
| 452 |
+
```bash
|
| 453 |
+
pip install mistral-common --upgrade
|
| 454 |
+
```
|
| 455 |
+
|
| 456 |
+
Try it out by running the following snippet.
|
| 457 |
+
|
| 458 |
+
> [!Tip]
|
| 459 |
+
> By default Transformers will load the checkpoint in FP8 and dequantize it to BF16 on the fly,
|
| 460 |
+
> which means the model currently does not make use of accelerated FP8-kernels.
|
| 461 |
+
> Compatibility with accelerated FP8-kernels is currently being worked on and will be available in a couple of weeks.
|
| 462 |
+
> Stay tuned!
|
| 463 |
+
|
| 464 |
<details>
|
| 465 |
+
<summary>Python snippet</summary>
|
| 466 |
+
|
| 467 |
+
```python
|
| 468 |
+
import torch
|
| 469 |
+
from transformers import Mistral3ForConditionalGeneration, MistralCommonBackend
|
| 470 |
+
|
| 471 |
+
model_id = "mistralai/Ministral-3-14B-Instruct-2512"
|
| 472 |
+
|
| 473 |
+
tokenizer = MistralCommonBackend.from_pretrained(model_id)
|
| 474 |
+
model = Mistral3ForConditionalGeneration.from_pretrained(model_id, device_map="auto")
|
| 475 |
+
|
| 476 |
+
image_url = "https://static.wikia.nocookie.net/essentialsdocs/images/7/70/Battle.png/revision/latest?cb=20220523172438"
|
| 477 |
+
|
| 478 |
+
messages = [
|
| 479 |
+
{
|
| 480 |
+
"role": "user",
|
| 481 |
+
"content": [
|
| 482 |
+
{
|
| 483 |
+
"type": "text",
|
| 484 |
+
"text": "What action do you think I should take in this situation? List all the possible actions and explain why you think they are good or bad.",
|
| 485 |
+
},
|
| 486 |
+
{"type": "image_url", "image_url": {"url": image_url}},
|
| 487 |
+
],
|
| 488 |
+
},
|
| 489 |
+
]
|
| 490 |
+
|
| 491 |
+
tokenized = tokenizer.apply_chat_template(messages, return_tensors="pt", return_dict=True)
|
| 492 |
+
|
| 493 |
+
tokenized["input_ids"] = tokenized["input_ids"].to(device="cuda")
|
| 494 |
+
tokenized["pixel_values"] = tokenized["pixel_values"].to(dtype=torch.bfloat16, device="cuda")
|
| 495 |
+
image_sizes = [tokenized["pixel_values"].shape[-2:]]
|
| 496 |
+
|
| 497 |
+
output = model.generate(
|
| 498 |
+
**tokenized,
|
| 499 |
+
image_sizes=image_sizes,
|
| 500 |
+
max_new_tokens=512,
|
| 501 |
+
)[0]
|
| 502 |
+
|
| 503 |
+
decoded_output = tokenizer.decode(output[len(tokenized["input_ids"][0]):])
|
| 504 |
+
print(decoded_output)
|
| 505 |
+
```
|
| 506 |
+
|
| 507 |
+
**Note:**
|
| 508 |
+
|
| 509 |
+
Transformers allows you to automatically convert the checkpoint to BFloat16. To do so, simply load the model as follows:
|
| 510 |
+
|
| 511 |
+
```py
|
| 512 |
+
from transformers import Mistral3ForConditionalGeneration, FineGrainedFP8Config
|
| 513 |
+
|
| 514 |
+
model_id = "mistralai/Ministral-3-14B-Instruct-2512"
|
| 515 |
+
model = Mistral3ForConditionalGeneration.from_pretrained(
|
| 516 |
+
model_id,
|
| 517 |
+
device_map="auto",
|
| 518 |
+
quantization_config=FineGrainedFP8Config(dequantize=True)
|
| 519 |
+
)
|
| 520 |
+
```
|
| 521 |
|
| 522 |
</details>
|
| 523 |
|
| 524 |
+
## License
|
| 525 |
|
| 526 |
+
This model is licensed under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0.txt).
|
| 527 |
|
| 528 |
+
*You must not use this model in a manner that infringes, misappropriates, or otherwise violates any third party’s rights, including intellectual property rights.*
|
|
|
SYSTEM_PROMPT.txt
ADDED
|
@@ -0,0 +1,29 @@
|
| 1 |
+
You are Ministral-3-14B-Instruct-2512, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris.
|
| 2 |
+
You power an AI assistant called Le Chat.
|
| 3 |
+
Your knowledge base was last updated on 2023-10-01.
|
| 4 |
+
The current date is {today}.
|
| 5 |
+
|
| 6 |
+
When you're not sure about some information or when the user's request requires up-to-date or specific data, you must use the available tools to fetch the information. Do not hesitate to use tools whenever they can provide a more accurate or complete response. If no relevant tools are available, then clearly state that you don't have the information and avoid making up anything.
|
| 7 |
+
If the user's question is not clear, ambiguous, or does not provide enough context for you to accurately answer the question, you do not try to answer it right away and you rather ask the user to clarify their request (e.g. "What are some good restaurants around me?" => "Where are you?" or "When is the next flight to Tokyo" => "Where do you travel from?").
|
| 8 |
+
You are always very attentive to dates, in particular you try to resolve dates (e.g. "yesterday" is {yesterday}) and when asked about information at specific dates, you discard information that is at another date.
|
| 9 |
+
You follow these instructions in all languages, and always respond to the user in the language they use or request.
|
| 10 |
+
Next sections describe the capabilities that you have.
|
| 11 |
+
|
| 12 |
+
# WEB BROWSING INSTRUCTIONS
|
| 13 |
+
|
| 14 |
+
You cannot perform any web search or access internet to open URLs, links etc. If it seems like the user is expecting you to do so, you clarify the situation and ask the user to copy paste the text directly in the chat.
|
| 15 |
+
|
| 16 |
+
# MULTI-MODAL INSTRUCTIONS
|
| 17 |
+
|
| 18 |
+
You have the ability to read images, but you cannot generate images. You also cannot transcribe audio files or videos.
|
| 19 |
+
You cannot read nor transcribe audio files or videos.
|
| 20 |
+
|
| 21 |
+
# TOOL CALLING INSTRUCTIONS
|
| 22 |
+
|
| 23 |
+
You may have access to tools that you can use to fetch information or perform actions. You must use these tools in the following situations:
|
| 24 |
+
|
| 25 |
+
1. When the request requires up-to-date information.
|
| 26 |
+
2. When the request requires specific data that you do not have in your knowledge base.
|
| 27 |
+
3. When the request involves actions that you cannot perform without tools.
|
| 28 |
+
|
| 29 |
+
Always prioritize using tools to provide the most accurate and helpful response. If tools are not available, inform the user that you cannot perform the requested action at the moment.
|
chat_template.jinja
ADDED
|
@@ -0,0 +1,121 @@
| 1 |
+
{#- Default system message if no system prompt is passed. #}
|
| 2 |
+
{%- set default_system_message = 'You are Ministral-3-14B-Instruct-2512, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris.\nYou power an AI assistant called Le Chat.\nYour knowledge base was last updated on 2023-10-01.\nThe current date is {today}.\n\nWhen you\'re not sure about some information or when the user\'s request requires up-to-date or specific data, you must use the available tools to fetch the information. Do not hesitate to use tools whenever they can provide a more accurate or complete response. If no relevant tools are available, then clearly state that you don\'t have the information and avoid making up anything.\nIf the user\'s question is not clear, ambiguous, or does not provide enough context for you to accurately answer the question, you do not try to answer it right away and you rather ask the user to clarify their request (e.g. "What are some good restaurants around me?" => "Where are you?" or "When is the next flight to Tokyo" => "Where do you travel from?").\nYou are always very attentive to dates, in particular you try to resolve dates (e.g. "yesterday" is {yesterday}) and when asked about information at specific dates, you discard information that is at another date.\nYou follow these instructions in all languages, and always respond to the user in the language they use or request.\nNext sections describe the capabilities that you have.\n\n# WEB BROWSING INSTRUCTIONS\n\nYou cannot perform any web search or access internet to open URLs, links etc. If it seems like the user is expecting you to do so, you clarify the situation and ask the user to copy paste the text directly in the chat.\n\n# MULTI-MODAL INSTRUCTIONS\n\nYou have the ability to read images, but you cannot generate images. You also cannot transcribe audio files or videos.\nYou cannot read nor transcribe audio files or videos.\n\n# TOOL CALLING INSTRUCTIONS\n\nYou may have access to tools that you can use to fetch information or perform actions. You must use these tools in the following situations:\n\n1. When the request requires up-to-date information.\n2. When the request requires specific data that you do not have in your knowledge base.\n3. When the request involves actions that you cannot perform without tools.\n\nAlways prioritize using tools to provide the most accurate and helpful response. If tools are not available, inform the user that you cannot perform the requested action at the moment.' %}
|
| 3 |
+
|
| 4 |
+
{#- Begin of sequence token. #}
|
| 5 |
+
{{- bos_token }}
|
| 6 |
+
|
| 7 |
+
{#- Handle system prompt if it exists. #}
|
| 8 |
+
{#- System prompt supports text content or text chunks. #}
|
| 9 |
+
{%- if messages[0]['role'] == 'system' %}
|
| 10 |
+
{{- '[SYSTEM_PROMPT]' -}}
|
| 11 |
+
{%- if messages[0]['content'] is string %}
|
| 12 |
+
{{- messages[0]['content'] -}}
|
| 13 |
+
{%- else %}
|
| 14 |
+
{%- for block in messages[0]['content'] %}
|
| 15 |
+
{%- if block['type'] == 'text' %}
|
| 16 |
+
{{- block['text'] }}
|
| 17 |
+
{%- else %}
|
| 18 |
+
{{- raise_exception('Only text chunks are supported in system message contents.') }}
|
| 19 |
+
{%- endif %}
|
| 20 |
+
{%- endfor %}
|
| 21 |
+
{%- endif %}
|
| 22 |
+
{{- '[/SYSTEM_PROMPT]' -}}
|
| 23 |
+
{%- set loop_messages = messages[1:] %}
|
| 24 |
+
{%- else %}
|
| 25 |
+
{%- set loop_messages = messages %}
|
| 26 |
+
{%- if default_system_message != '' %}
|
| 27 |
+
{{- '[SYSTEM_PROMPT]' + default_system_message + '[/SYSTEM_PROMPT]' }}
|
| 28 |
+
{%- endif %}
|
| 29 |
+
{%- endif %}
|
| 30 |
+
|
| 31 |
+
|
| 32 |
+
{#- Tools definition #}
|
| 33 |
+
{%- set tools_definition = '' %}
|
| 34 |
+
{%- set has_tools = false %}
|
| 35 |
+
{%- if tools is defined and tools is not none and tools|length > 0 %}
|
| 36 |
+
{%- set has_tools = true %}
|
| 37 |
+
{%- set tools_definition = '[AVAILABLE_TOOLS]' + (tools| tojson) + '[/AVAILABLE_TOOLS]' %}
|
| 38 |
+
{{- tools_definition }}
|
| 39 |
+
{%- endif %}
|
| 40 |
+
|
| 41 |
+
{#- Checks for alternating user/assistant messages. #}
|
| 42 |
+
{%- set ns = namespace(index=0) %}
|
| 43 |
+
{%- for message in loop_messages %}
|
| 44 |
+
{%- if message.role == 'user' or (message.role == 'assistant' and (message.tool_calls is not defined or message.tool_calls is none or message.tool_calls | length == 0)) %}
|
| 45 |
+
{%- if (message['role'] == 'user') != (ns.index % 2 == 0) %}
|
| 46 |
+
{{- raise_exception('After the optional system message, conversation roles must alternate user and assistant roles except for tool calls and results.') }}
|
| 47 |
+
{%- endif %}
|
| 48 |
+
{%- set ns.index = ns.index + 1 %}
|
| 49 |
+
{%- endif %}
|
| 50 |
+
{%- endfor %}
|
| 51 |
+
|
| 52 |
+
{#- Handle conversation messages. #}
|
| 53 |
+
{%- for message in loop_messages %}
|
| 54 |
+
|
| 55 |
+
{#- User messages supports text content or text and image chunks. #}
|
| 56 |
+
{%- if message['role'] == 'user' %}
|
| 57 |
+
{%- if message['content'] is string %}
|
| 58 |
+
{{- '[INST]' + message['content'] + '[/INST]' }}
|
| 59 |
+
{%- elif message['content'] | length > 0 %}
|
| 60 |
+
{{- '[INST]' }}
|
| 61 |
+
{%- if message['content'] | length == 2 %}
|
| 62 |
+
{%- set blocks = message['content'] | sort(attribute='type') %}
|
| 63 |
+
{%- else %}
|
| 64 |
+
{%- set blocks = message['content'] %}
|
| 65 |
+
{%- endif %}
|
| 66 |
+
{%- for block in blocks %}
|
| 67 |
+
{%- if block['type'] == 'text' %}
|
| 68 |
+
{{- block['text'] }}
|
| 69 |
+
{%- elif block['type'] in ['image', 'image_url'] %}
|
| 70 |
+
{{- '[IMG]' }}
|
| 71 |
+
{%- else %}
|
| 72 |
+
{{- raise_exception('Only text, image and image_url chunks are supported in user message content.') }}
|
| 73 |
+
{%- endif %}
|
| 74 |
+
{%- endfor %}
|
| 75 |
+
{{- '[/INST]' }}
|
| 76 |
+
{%- else %}
|
| 77 |
+
{{- raise_exception('User message must have a string or a list of chunks in content') }}
|
| 78 |
+
{%- endif %}
|
| 79 |
+
|
| 80 |
+
{#- Assistant messages supports text content or text and image chunks. #}
|
| 81 |
+
{%- elif message['role'] == 'assistant' %}
|
| 82 |
+
{%- if (message['content'] is none or message['content'] == '' or message['content']|length == 0) and (message['tool_calls'] is not defined or message['tool_calls'] is none or message['tool_calls']|length == 0) %}
|
| 83 |
+
{{- raise_exception('Assistant message must have a string or a list of chunks in content or a list of tool calls.') }}
|
| 84 |
+
{%- endif %}
|
| 85 |
+
|
| 86 |
+
{%- if message['content'] is string %}
|
| 87 |
+
{{- message['content'] }}
|
| 88 |
+
{%- elif message['content'] | length > 0 %}
|
| 89 |
+
{%- for block in message['content'] %}
|
| 90 |
+
{%- if block['type'] == 'text' %}
|
| 91 |
+
{{- block['text'] }}
|
| 92 |
+
{%- else %}
|
| 93 |
+
{{- raise_exception('Only text chunks are supported in assistant message contents.') }}
|
| 94 |
+
{%- endif %}
|
| 95 |
+
{%- endfor %}
|
| 96 |
+
{%- endif %}
|
| 97 |
+
|
| 98 |
+
{%- if message['tool_calls'] is defined and message['tool_calls'] is not none and message['tool_calls']|length > 0 %}
|
| 99 |
+
{%- for tool in message['tool_calls'] %}
|
| 100 |
+
{%- set arguments = tool['function']['arguments'] %}
|
| 101 |
+
{%- if arguments is not string %}
|
| 102 |
+
{%- set arguments = arguments|tojson|safe %}
|
| 103 |
+
{%- elif arguments == '' %}
|
| 104 |
+
{%- set arguments = '{}' %}
|
| 105 |
+
{%- endif %}
|
| 106 |
+
{{- '[TOOL_CALLS]' + tool['function']['name'] + '[ARGS]' + arguments }}
|
| 107 |
+
{%- endfor %}
|
| 108 |
+
{%- endif %}
|
| 109 |
+
|
| 110 |
+
{#- End of sequence token for each assistant messages. #}
|
| 111 |
+
{{- eos_token }}
|
| 112 |
+
|
| 113 |
+
{#- Tool messages only supports text content. #}
|
| 114 |
+
{%- elif message['role'] == 'tool' %}
|
| 115 |
+
{{- '[TOOL_RESULTS]' + message['content']|string + '[/TOOL_RESULTS]' }}
|
| 116 |
+
|
| 117 |
+
{#- Raise exception for unsupported roles. #}
|
| 118 |
+
{%- else %}
|
| 119 |
+
{{- raise_exception('Only user, assistant and tool roles are supported, got ' + message['role'] + '.') }}
|
| 120 |
+
{%- endif %}
|
| 121 |
+
{%- endfor %}
|
config.json
ADDED
|
@@ -0,0 +1,87 @@
|
| 1 |
+
{
|
| 2 |
+
"architectures": [
|
| 3 |
+
"Mistral3ForConditionalGeneration"
|
| 4 |
+
],
|
| 5 |
+
"dtype": "bfloat16",
|
| 6 |
+
"image_token_index": 10,
|
| 7 |
+
"tie_word_embeddings": false,
|
| 8 |
+
"model_type": "mistral3",
|
| 9 |
+
"multimodal_projector_bias": false,
|
| 10 |
+
"projector_hidden_act": "gelu",
|
| 11 |
+
"quantization_config": {
|
| 12 |
+
"quant_method": "exl3",
|
| 13 |
+
"version": "0.0.16",
|
| 14 |
+
"bits": 8.0,
|
| 15 |
+
"head_bits": 6,
|
| 16 |
+
"calibration": {
|
| 17 |
+
"rows": 250,
|
| 18 |
+
"cols": 2048
|
| 19 |
+
},
|
| 20 |
+
"out_scales": "always",
|
| 21 |
+
"codebook": "mcg",
|
| 22 |
+
"original_quantization_config": {
|
| 23 |
+
"activation_scheme": "static",
|
| 24 |
+
"dequantize": false,
|
| 25 |
+
"modules_to_not_convert": [
|
| 26 |
+
"model.vision_tower",
|
| 27 |
+
"model.multi_modal_projector",
|
| 28 |
+
"lm_head",
|
| 29 |
+
"model.vision_tower",
|
| 30 |
+
"model.multi_modal_projector",
|
| 31 |
+
"lm_head"
|
| 32 |
+
],
|
| 33 |
+
"quant_method": "fp8",
|
| 34 |
+
"weight_block_size": null
|
| 35 |
+
}
|
| 36 |
+
},
|
| 37 |
+
"spatial_merge_size": 2,
|
| 38 |
+
"text_config": {
|
| 39 |
+
"attention_dropout": 0.0,
|
| 40 |
+
"head_dim": 128,
|
| 41 |
+
"hidden_act": "silu",
|
| 42 |
+
"hidden_size": 5120,
|
| 43 |
+
"initializer_range": 0.02,
|
| 44 |
+
"intermediate_size": 16384,
|
| 45 |
+
"max_position_embeddings": 262144,
|
| 46 |
+
"model_type": "ministral3",
|
| 47 |
+
"num_attention_heads": 32,
|
| 48 |
+
"num_hidden_layers": 40,
|
| 49 |
+
"num_key_value_heads": 8,
|
| 50 |
+
"rms_norm_eps": 1e-05,
|
| 51 |
+
"rope_parameters": {
|
| 52 |
+
"beta_fast": 32.0,
|
| 53 |
+
"beta_slow": 1.0,
|
| 54 |
+
"factor": 16.0,
|
| 55 |
+
"llama_4_scaling_beta": 0.1,
|
| 56 |
+
"mscale": 1.0,
|
| 57 |
+
"mscale_all_dim": 1.0,
|
| 58 |
+
"original_max_position_embeddings": 16384,
|
| 59 |
+
"rope_theta": 1000000000.0,
|
| 60 |
+
"rope_type": "yarn",
|
| 61 |
+
"type": "yarn"
|
| 62 |
+
},
|
| 63 |
+
"sliding_window": null,
|
| 64 |
+
"use_cache": true,
|
| 65 |
+
"vocab_size": 131072
|
| 66 |
+
},
|
| 67 |
+
"transformers_version": "5.0.0.dev0",
|
| 68 |
+
"vision_config": {
|
| 69 |
+
"attention_dropout": 0.0,
|
| 70 |
+
"head_dim": 64,
|
| 71 |
+
"hidden_act": "silu",
|
| 72 |
+
"hidden_size": 1024,
|
| 73 |
+
"image_size": 1540,
|
| 74 |
+
"initializer_range": 0.02,
|
| 75 |
+
"intermediate_size": 4096,
|
| 76 |
+
"model_type": "pixtral",
|
| 77 |
+
"num_attention_heads": 16,
|
| 78 |
+
"num_channels": 3,
|
| 79 |
+
"num_hidden_layers": 24,
|
| 80 |
+
"patch_size": 14,
|
| 81 |
+
"rope_parameters": {
|
| 82 |
+
"rope_theta": 10000.0,
|
| 83 |
+
"rope_type": "default"
|
| 84 |
+
}
|
| 85 |
+
},
|
| 86 |
+
"vision_feature_layer": -1
|
| 87 |
+
}
|
generation_config.json
ADDED
|
@@ -0,0 +1,7 @@
|
| 1 |
+
{
|
| 2 |
+
"bos_token_id": 1,
|
| 3 |
+
"eos_token_id": 2,
|
| 4 |
+
"max_length": 262144,
|
| 5 |
+
"pad_token_id": 11,
|
| 6 |
+
"transformers_version": "5.0.0.dev0"
|
| 7 |
+
}
|
model-00001-of-00002.safetensors
ADDED
|
@@ -0,0 +1,3 @@
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:b2d5010137ee1bfeee53225eabf48c92a13b31421564b30e6c45c2c31a7aa202
|
| 3 |
+
size 8341115140
|
model-00002-of-00002.safetensors
ADDED
|
@@ -0,0 +1,3 @@
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:c550b94ced4bf0ec76521645a5f6d87af697dfded5e0a867a1eea3fc66b1847f
|
| 3 |
+
size 6554672936
|
model.safetensors.index.json
ADDED
|
The diff for this file is too large to render.
See raw diff
|
|
|
params.json
ADDED
|
@@ -0,0 +1,51 @@
|
| 1 |
+
{
|
| 2 |
+
"dim": 5120,
|
| 3 |
+
"n_layers": 40,
|
| 4 |
+
"head_dim": 128,
|
| 5 |
+
"hidden_dim": 16384,
|
| 6 |
+
"n_heads": 32,
|
| 7 |
+
"n_kv_heads": 8,
|
| 8 |
+
"rope_theta": 1000000000.0,
|
| 9 |
+
"norm_eps": 1e-05,
|
| 10 |
+
"vocab_size": 131072,
|
| 11 |
+
"tied_embeddings": false,
|
| 12 |
+
"max_position_embeddings": 262144,
|
| 13 |
+
"llama_4_scaling": {
|
| 14 |
+
"original_max_position_embeddings": 16384,
|
| 15 |
+
"beta": 0.1
|
| 16 |
+
},
|
| 17 |
+
"q_lora_rank": null,
|
| 18 |
+
"qk_rope_head_dim": null,
|
| 19 |
+
"qk_nope_head_dim": null,
|
| 20 |
+
"kv_lora_rank": null,
|
| 21 |
+
"v_head_dim": null,
|
| 22 |
+
"quantization": {
|
| 23 |
+
"qformat_weight": "fp8_e4m3",
|
| 24 |
+
"qscheme_act": "TENSOR"
|
| 25 |
+
},
|
| 26 |
+
"yarn": {
|
| 27 |
+
"original_max_position_embeddings": 16384,
|
| 28 |
+
"factor": 16,
|
| 29 |
+
"apply_scale": false,
|
| 30 |
+
"beta": 32,
|
| 31 |
+
"alpha": 1
|
| 32 |
+
},
|
| 33 |
+
"vision_encoder": {
|
| 34 |
+
"image_token_id": 10,
|
| 35 |
+
"image_break_token_id": 12,
|
| 36 |
+
"image_end_token_id": 13,
|
| 37 |
+
"intermediate_size": 4096,
|
| 38 |
+
"num_hidden_layers": 24,
|
| 39 |
+
"num_attention_heads": 16,
|
| 40 |
+
"mm_projector_id": "patch_merge",
|
| 41 |
+
"spatial_merge_size": 2,
|
| 42 |
+
"hidden_size": 1024,
|
| 43 |
+
"num_channels": 3,
|
| 44 |
+
"image_size": 1540,
|
| 45 |
+
"max_image_size": 1540,
|
| 46 |
+
"patch_size": 14,
|
| 47 |
+
"rope_theta": 10000.0,
|
| 48 |
+
"add_pre_mm_projector_layer_norm": true,
|
| 49 |
+
"adapter_bias": false
|
| 50 |
+
}
|
| 51 |
+
}
|
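`params.json` is the Mistral-native description of the same model: 40 layers, 32 query heads sharing 8 KV heads (grouped-query attention), and a YaRN factor of 16 that takes the 16,384-token training window to the full 262,144-token `max_position_embeddings`. A minimal sketch (path assumed) that recomputes those relationships:

```python
import json

with open("params.json") as f:  # local checkout assumed
    params = json.load(f)

gqa_group = params["n_heads"] // params["n_kv_heads"]  # 32 // 8 = 4 query heads per KV head
yarn = params["yarn"]
extended = yarn["factor"] * yarn["original_max_position_embeddings"]  # 16 * 16384

print(gqa_group)                                      # 4
print(extended == params["max_position_embeddings"])  # True (262144)
```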
preprocessor_config.json
ADDED
@@ -0,0 +1,34 @@
+ {
+   "crop_size": null,
+   "data_format": "channels_first",
+   "device": null,
+   "disable_grouping": null,
+   "do_center_crop": null,
+   "do_convert_rgb": true,
+   "do_normalize": true,
+   "do_pad": null,
+   "do_rescale": true,
+   "do_resize": true,
+   "image_mean": [
+     0.48145466,
+     0.4578275,
+     0.40821073
+   ],
+   "image_processor_type": "PixtralImageProcessorFast",
+   "image_seq_length": null,
+   "image_std": [
+     0.26862954,
+     0.26130258,
+     0.27577711
+   ],
+   "input_data_format": null,
+   "pad_size": null,
+   "patch_size": 14,
+   "processor_class": "PixtralProcessor",
+   "resample": 3,
+   "rescale_factor": 0.00392156862745098,
+   "return_tensors": null,
+   "size": {
+     "longest_edge": 1540
+   }
+ }
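With `longest_edge` 1540, `patch_size` 14 and the 2×2 `spatial_merge_size` from `processor_config.json` below, a maximum-size square image becomes a 110×110 patch grid that the patch-merging projector reduces to 55×55 ≈ 3,025 image tokens. A rough sketch of that arithmetic, assuming the usual resize-to-longest-edge behaviour (the helper and its rounding are illustrative, not the library's exact code):

```python
import math

def estimate_image_tokens(width: int, height: int, longest_edge: int = 1540,
                          patch_size: int = 14, spatial_merge_size: int = 2) -> int:
    """Rough image-token count; the real preprocessor may round slightly differently."""
    scale = min(1.0, longest_edge / max(width, height))  # only shrink, never upscale (assumed)
    patches_w = math.ceil(width * scale / patch_size)
    patches_h = math.ceil(height * scale / patch_size)
    # The patch_merge projector folds each 2x2 block of patches into one token.
    return math.ceil(patches_w / spatial_merge_size) * math.ceil(patches_h / spatial_merge_size)

print(estimate_image_tokens(1540, 1540))  # 55 * 55 = 3025
```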
processor_config.json
ADDED
@@ -0,0 +1,42 @@
+ {
+   "image_break_token": "[IMG_BREAK]",
+   "image_end_token": "[IMG_END]",
+   "image_processor": {
+     "crop_size": null,
+     "data_format": "channels_first",
+     "device": null,
+     "disable_grouping": null,
+     "do_center_crop": null,
+     "do_convert_rgb": true,
+     "do_normalize": true,
+     "do_pad": null,
+     "do_rescale": true,
+     "do_resize": true,
+     "image_mean": [
+       0.48145466,
+       0.4578275,
+       0.40821073
+     ],
+     "image_processor_type": "PixtralImageProcessorFast",
+     "image_seq_length": null,
+     "image_std": [
+       0.26862954,
+       0.26130258,
+       0.27577711
+     ],
+     "input_data_format": null,
+     "pad_size": null,
+     "patch_size": 14,
+     "processor_class": "PixtralProcessor",
+     "resample": 3,
+     "rescale_factor": 0.00392156862745098,
+     "return_tensors": null,
+     "size": {
+       "longest_edge": 1540
+     }
+   },
+   "image_token": "[IMG]",
+   "patch_size": 14,
+   "processor_class": "PixtralProcessor",
+   "spatial_merge_size": 2
+ }
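`processor_config.json` ties the image processor to the `[IMG]`/`[IMG_BREAK]`/`[IMG_END]` control tokens so that a single `AutoProcessor` call turns text plus images into model inputs. A minimal sketch, assuming a local checkout of the repo and a local image file (both paths are placeholders, and the raw prompt string stands in for what `chat_template.jinja` would normally build):

```python
from PIL import Image
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("./Ministral-3-14B-Instruct-2512")  # placeholder path

image = Image.open("example.jpg")  # placeholder image
inputs = processor(
    text="[INST][IMG]Describe this picture.[/INST]",  # [IMG] marks where the image goes
    images=[image],
    return_tensors="pt",
)
print(inputs.keys())  # expect input_ids, attention_mask, pixel_values, ...
```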
quantization_config.json
ADDED
The diff for this file is too large to render. See raw diff.
tekken.json
ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e29d19ea32eb7e26e6c0572d57cb7f9eca0f4420e0e0fe6ae1cf3be94da1c0d6
+ size 16753777
tokenizer.json
ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:286acad9b0e27fce778ac429763536accf618ccb6ed72963b6f94685e531c5c7
+ size 17077402
tokenizer_config.json
ADDED
@@ -0,0 +1,1010 @@
+ {
+   "additional_special_tokens": null,
+   "backend": "tokenizers",
+   "extra_special_tokens": [
+     "<unk>",
+     "<s>",
+     "</s>",
+     "[INST]",
+     "[/INST]",
+     "[AVAILABLE_TOOLS]",
+     "[/AVAILABLE_TOOLS]",
+     "[TOOL_RESULTS]",
+     "[/TOOL_RESULTS]",
+     "[TOOL_CALLS]",
+     "[IMG]",
+     "<pad>",
+     "[IMG_BREAK]",
+     "[IMG_END]",
+     "[PREFIX]",
+     "[MIDDLE]",
+     "[SUFFIX]",
+     "[SYSTEM_PROMPT]",
+     "[/SYSTEM_PROMPT]",
+     "[TOOL_CONTENT]",
+     "<SPECIAL_20>",
+     "<SPECIAL_21>",
+     "<SPECIAL_22>",
+     "<SPECIAL_23>",
+     "[AUDIO]",
+     "[BEGIN_AUDIO]",
+     "<SPECIAL_26>",
+     "<SPECIAL_27>",
+     "<SPECIAL_28>",
+     "<SPECIAL_29>",
+     "<SPECIAL_30>",
+     "<SPECIAL_31>",
+     "[ARGS]",
+     "[CALL_ID]",
+     "[THINK]",
+     "[/THINK]",
+     "<SPECIAL_36>",
+     "<SPECIAL_37>",
+     "<SPECIAL_38>",
+     ...
+     "<SPECIAL_998>",
+     "<SPECIAL_999>"
+   ],
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "<pad>",
+   "processor_class": "PixtralProcessor",
+   "tokenizer_class": "TokenizersBackend"
+ }
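The `extra_special_tokens` list registers every control token ([INST], [SYSTEM_PROMPT], [THINK], the image markers, plus the reserved `<SPECIAL_N>` placeholders) so the tokenizer never splits them into BPE pieces. A minimal check, assuming a local checkout of the repo (placeholder path):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./Ministral-3-14B-Instruct-2512")  # placeholder path

for token in ["[INST]", "[/INST]", "[SYSTEM_PROMPT]", "[IMG]", "[THINK]"]:
    print(token, "->", tokenizer.convert_tokens_to_ids(token))

# Each control token should encode to a single id, not a run of BPE pieces.
assert len(tokenizer.encode("[INST]", add_special_tokens=False)) == 1
```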