## About
GGUF Quantizations of https://huggingface.co/01-ai/Yi-1.5-9B
## Provided Quantizations
| Link | Type |
|---|---|
| GGUF | Q2_K |
| GGUF | Q3_K_S |
| GGUF | Q3_K_M |
| GGUF | Q3_K_L |
| GGUF | Q4_0 |
| GGUF | Q4_K_S |
| GGUF | Q4_K_M |
| GGUF | Q5_0 |
| GGUF | Q5_K_S |
| GGUF | Q5_K_M |
| GGUF | Q6_K |
| GGUF | Q8_0 |
In a circular citation, I borrowed the format of this file from https://huggingface.co/mradermacher/copy_of_wildjailbreak_13-GGUF.
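For reference, a quant from this repo can be downloaded and run with llama.cpp's Python bindings. The sketch below is a minimal example, not an official snippet: it assumes the repo id `larenspear/Yi-1.5-9B-Chat-GGUF` and a conventional filename such as `Yi-1.5-9B.Q4_K_M.gguf`, which may differ from the actual file names in the repo.

```python
# Minimal sketch: download one quant and run it with llama-cpp-python.
# The repo id and filename are assumptions based on common GGUF naming
# conventions; check the repo's "Files" tab for the exact names.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch the (assumed) Q4_K_M quant from the Hub.
model_path = hf_hub_download(
    repo_id="larenspear/Yi-1.5-9B-Chat-GGUF",  # assumed repo id
    filename="Yi-1.5-9B.Q4_K_M.gguf",          # assumed filename
)

# Load the GGUF file; n_ctx sets the context window.
llm = Llama(model_path=model_path, n_ctx=4096)

# Simple completion call.
out = llm("Q: What is GGUF? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

Lower-bit quants (Q2_K, Q3_K_*) trade quality for smaller files and lower memory use; Q8_0 is closest to the original weights.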