'Make knowledge free for everyone'
Original FP8 weights dequantized with: https://github.com/csabakecskemeti/ministral-3_dequantizer_fp8-bf16
Quantized version of: mistralai/Ministral-3-14B-Instruct-2512
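For readers curious what the FP8→BF16 dequantization step involves, here is a minimal illustrative sketch of decoding a single FP8 E4M3 (`e4m3fn`) byte to a regular float. This is not the linked dequantizer's actual code; a real dequantizer operates on whole tensors and also applies each tensor's stored scale factor.

```python
import math

def fp8_e4m3_to_float(b: int) -> float:
    """Decode one FP8 e4m3fn byte: 1 sign bit, 4 exponent bits (bias 7), 3 mantissa bits."""
    sign = -1.0 if b & 0x80 else 1.0
    exp = (b >> 3) & 0x0F
    mant = b & 0x07
    if exp == 0x0F and mant == 0x07:
        return math.nan                              # e4m3fn has no inf; only this pattern is NaN
    if exp == 0:
        return sign * (mant / 8.0) * 2.0 ** -6       # subnormal
    return sign * (1.0 + mant / 8.0) * 2.0 ** (exp - 7)

# e.g. 0x38 -> 1.0, 0x40 -> 2.0, 0x7E -> 448.0 (max finite value)
```

Upcasting every byte this way (times the per-tensor scale) yields BF16/FP32 weights that llama.cpp can then re-quantize into the GGUF formats offered here.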
Model tree for DevQuasar/mistralai.Ministral-3-14B-Instruct-2512-GGUF
- Base model: mistralai/Ministral-3-14B-Base-2512
- Quantized from: mistralai/Ministral-3-14B-Instruct-2512
