---
license: mit
base_model: zai-org/GLM-4.1V-9B-Thinking
base_model_relation: quantized
quantized_by: turboderp
tags:
  - exl3
---

EXL3 quants of GLM-4.1V-9B-Thinking

⚠️ Requires ExLlamaV3 v0.0.15 (or the v0.0.14 dev branch)
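
A quick way to confirm the installed version meets this requirement (a minimal sketch; the distribution name `exllamav3` is an assumption, adjust it if the package is installed under a different name):

```python
# Check the installed ExLlamaV3 version against the requirement above.
# The distribution name "exllamav3" is assumed.
from importlib.metadata import version

print(version("exllamav3"))  # expect >= 0.0.15
```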

- 2.00 bits per weight
- 2.50 bits per weight
- 3.00 bits per weight
- 3.50 bits per weight
- 4.00 bits per weight
- 5.00 bits per weight
- 6.00 bits per weight
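
Each bitrate is typically published on its own branch of the repository. Below is a minimal sketch for downloading a single variant with `huggingface_hub`; the `repo_id` and the branch name `4.00bpw` are assumptions, so check the repository's branch list for the exact names before use.

```python
# Minimal sketch: download one EXL3 quantization variant.
# The repo_id and revision below are assumptions -- verify them against
# the actual repository and its branch names.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="turboderp/GLM-4.1V-9B-Thinking-exl3",  # assumed repo id
    revision="4.00bpw",                              # assumed branch name
)
print(f"Model files downloaded to: {local_dir}")
```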