Vibe and Devstral-2
#9 · opened by Cypherfox
Greetings,
I've been trying to use vibe with a vLLM instance running Devstral-2, and it's not working for me. I keep getting:

```
Error: API error from local-mistral (model: mistralai/Devstral-Small-2-24B-Instruct-2512): Streamed completion returned no chunks
```
My configuration looks relatively normal, but includes:
```toml
[[providers]]
name = "local-mistral"
api_base = "http://{myhost}.{example}.org:8000/v1"
api_key_env_var = ""
api_style = "openai"
backend = "generic"
```
and
```toml
[[models]]
name = "mistralai/Devstral-Small-2-24B-Instruct-2512"
provider = "local-mistral"
alias = "devstral-2"
temperature = 0.2
input_price = 0.4
output_price = 2.0
```
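In case it helps narrow things down, the server can be probed directly over vLLM's OpenAI-compatible chat-completions endpoint, bypassing vibe entirely. This is a minimal sketch, assuming a standard vLLM deployment; `{myhost}.{example}.org` is the same placeholder host as in the config above:

```shell
# Sketch: stream a completion straight from the vLLM OpenAI-compatible endpoint.
# -N disables curl's output buffering so SSE chunks appear as they arrive.
curl -N http://{myhost}.{example}.org:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mistralai/Devstral-Small-2-24B-Instruct-2512",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": true
      }'
```

If the server is healthy, this should emit a series of `data: {...}` SSE lines terminated by `data: [DONE]`; if it returns nothing, the "no chunks" error is on the vLLM side rather than in the vibe config.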
Am I doing something wrong?
-- Morgan