Anticipated Availability of LLM-Compressor Format Models (W4A16/W8A16)
#4 opened by X-SZM
Quantization formats such as GGUF and AWQ are already widely available in the community, but models in llm-compressor schemes like W4A16 and W8A16 remain scarce. These schemes offer strong quantization quality with comparatively low precision loss. However, producing them requires a calibration pass over a dataset, which is difficult for most individual users to run. It would therefore be very helpful if organized community groups shared models in W4A16 and W8A16 formats. Such contributions would be greatly appreciated.
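For context on what the calibration step involves, below is a minimal recipe sketch in the style used by llm-compressor's GPTQ-based one-shot flow. The exact schema and field names (e.g. `config_groups`, `group_size`) are assumptions based on the project's published examples and may differ between versions, so treat this as an illustration rather than a working config:

```yaml
# Hypothetical W4A16 recipe sketch for llm-compressor's oneshot() entry point.
# 4-bit integer weights, 16-bit activations; lm_head left unquantized.
quant_stage:
  quant_modifiers:
    GPTQModifier:
      ignore: ["lm_head"]
      config_groups:
        group_0:
          targets: ["Linear"]
          weights:
            num_bits: 4
            type: "int"
            symmetric: true
            strategy: "group"
            group_size: 128
```

Applying a recipe like this still requires loading the full-precision model and running several hundred calibration samples through it, which is exactly the GPU- and time-intensive step that is hard for most individual users.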