Native function calls not working.

#2 by StatusQuo209

Tried Ollama and LM Studio; native function calling is not working for me with either backend. I am using Open WebUI. Default function-calling mode does work.

Anyone found a way to get this to work?

I am using llama.cpp with the `--jinja` option, the sampling params `temp 0.7, repeat-penalty 1.0, min-p 0.01, top-k -1, top-p 0.95`, and the recommended system prompt, and native tool calling works fine in Open WebUI. Are you using the chat template?
And without the system prompt, this model only talks complete rubbish for me...
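For reference, this is roughly the setup I mean — a minimal sketch of the llama-server invocation with those sampling parameters, where the model path and port are placeholders you'd swap for your own:

```sh
# Launch llama-server with the GGUF's chat template applied (--jinja)
# and the sampling parameters mentioned above.
# model.gguf and the port are placeholders.
./llama-server -m model.gguf --jinja \
  --temp 0.7 --repeat-penalty 1.0 --min-p 0.01 --top-k -1 --top-p 0.95 \
  --port 8080
```

You can then check that native tool calling is actually wired up, independent of Open WebUI, by sending a request with a `tools` array to the OpenAI-compatible endpoint and looking for a `tool_calls` entry in the response. The `get_weather` function here is just an example; in a real test you'd also include the recommended system prompt as a `system` message:

```sh
# Send a tool-enabled request; a working setup should return
# choices[0].message.tool_calls rather than plain text.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "What is the weather in Paris?"}
    ],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
```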
