---
license: apache-2.0
base_model: mistralai/Ministral-3-3B-Instruct-2512-BF16
---
# Ministral-3-3B-Instruct-2512-GGUF
This model is converted from [mistralai/Ministral-3-3B-Instruct-2512-BF16](https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512-BF16) to GGUF using llama.cpp's `convert_hf_to_gguf.py` script.
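For reference, a conversion of this kind typically looks like the sketch below. The checkpoint directory and output filename are placeholders, and the exact flags may vary with the llama.cpp revision used.

```
# Hypothetical example: convert the original BF16 checkpoint to GGUF
# with llama.cpp's convert_hf_to_gguf.py (paths are placeholders).
pip install -r llama.cpp/requirements.txt
python llama.cpp/convert_hf_to_gguf.py \
    ./Ministral-3-3B-Instruct-2512-BF16 \
    --outfile Ministral-3-3B-Instruct-2512-BF16.gguf \
    --outtype bf16
```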
To use it:
```
llama-server -hf ggml-org/Ministral-3-3B-Instruct-2512-GGUF
```
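Once the server is running, it exposes an OpenAI-compatible HTTP API. A minimal chat completion request might look like the following, assuming the default port 8080 (adjust if you pass `--port`):

```
curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "messages": [
            {"role": "user", "content": "Hello!"}
        ]
    }'
```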