---
license: mit
base_model: zai-org/GLM-4.1V-9B-Thinking
base_model_relation: quantized
quantized_by: turboderp
tags:
- exl3
---

EXL3 quants of [GLM-4.1V-9B-Thinking](https://huggingface.co/zai-org/GLM-4.1V-9B-Thinking)

⚠️ Requires ExLlamaV3 v0.0.15 (or the v0.0.14 `dev` branch)
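
If you're unsure which build is active in your environment, a quick sanity check (assuming ExLlamaV3 is installed as the `exllamav3` package) is:

```python
# Minimal version check, assuming the library is installed as "exllamav3".
from importlib.metadata import version

print(version("exllamav3"))  # should report 0.0.15, or a matching 0.0.14 dev build
```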

[2.00 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/2.00bpw)

[2.50 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/2.50bpw)

[3.00 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/3.00bpw)

[3.50 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/3.50bpw)

[4.00 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/4.00bpw)

[5.00 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/5.00bpw)

[6.00 bits per weight](https://huggingface.co/turboderp/GLM-4.1V-9B-Thinking-exl3/tree/6.00bpw)
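
As a rough sketch (not an official recipe from this card), the snippet below downloads one of the branches above with `huggingface_hub` and runs a short text-only generation through ExLlamaV3's generator API. The branch choice, cache size, and prompt are placeholders, and image input for the vision tower is not shown.

```python
# Sketch only: fetch one of the quant branches listed above and run a short
# text-only generation. Assumes the standard ExLlamaV3 generator API; the
# branch, cache size, and prompt are placeholders.
from huggingface_hub import snapshot_download
from exllamav3 import Config, Model, Cache, Tokenizer, Generator

# Download the 4.00 bpw branch (any branch name from the list above works).
local_dir = snapshot_download(
    repo_id="turboderp/GLM-4.1V-9B-Thinking-exl3",
    revision="4.00bpw",
)

# Load the quantized weights and build a generator.
config = Config.from_directory(local_dir)
model = Model.from_config(config)
cache = Cache(model, max_num_tokens=8192)
model.load()
tokenizer = Tokenizer.from_config(config)
generator = Generator(model=model, cache=cache, tokenizer=tokenizer)

print(generator.generate(prompt="Briefly introduce yourself.", max_new_tokens=200))
```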