---
license: apache-2.0
base_model: mistralai/Ministral-3-3B-Reasoning-2512
pipeline_tag: image-text-to-text
---

# Ministral-3-3B-Reasoning-2512-GGUF

This model was converted from [mistralai/Ministral-3-3B-Reasoning-2512](https://huggingface.co/mistralai/Ministral-3-3B-Reasoning-2512) to GGUF using llama.cpp's `convert_hf_to_gguf.py`.

To run it with `llama-server`:

```
llama-server -hf ggml-org/Ministral-3-3B-Reasoning-2512-GGUF
```
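
As a rough sketch of the conversion step (the local model path, output filename, and `--outtype` below are illustrative, not necessarily the exact settings used for this upload):

```
python convert_hf_to_gguf.py ./Ministral-3-3B-Reasoning-2512 \
  --outfile Ministral-3-3B-Reasoning-2512.gguf \
  --outtype f16
```

Once `llama-server` is running, it exposes an OpenAI-compatible API (on port 8080 by default), so the model can be queried with, for example:

```
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "What is the capital of France?"}]}'
```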