Will this support Ollama soon?
#1 opened by kingo55
Just noticed I can't access this model using Ollama.
I think it's because it's a sharded GGUF, which means you have to merge it into a single file for now. Have you tried using llama.cpp? https://docs.unsloth.ai/basics/qwen3-how-to-run-and-fine-tune#running-qwen3-235b-a22b
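If it helps, here's a rough sketch of how you could merge the shards with llama.cpp's gguf-split tool and then import the merged file into Ollama via a Modelfile. This assumes a local llama.cpp build (the binary is named `llama-gguf-split` in recent builds, `gguf-split` in older ones), and the file and model names below are placeholders, not the exact files in this repo:

```
# Merge the split GGUF into one file (point it at the first shard)
./llama-gguf-split --merge model-00001-of-00002.gguf model-merged.gguf

# Minimal Modelfile pointing at the merged file
echo 'FROM ./model-merged.gguf' > Modelfile

# Import into Ollama and run
ollama create qwen3-235b -f Modelfile
ollama run qwen3-235b
```

Not definitive, but that's roughly the workaround until Ollama can load the sharded files directly.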