mradermacher committed (verified)
Commit f0ec5c5 · 1 Parent(s): bba6dd2

auto-patch README.md

Files changed (1):
  1. README.md +13 -1

README.md CHANGED
@@ -37,7 +37,7 @@ static quants of https://huggingface.co/INSAIT-Institute/MamayLM-Gemma-3-4B-IT-v
 
 ***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#MamayLM-Gemma-3-4B-IT-v1.0-GGUF).***
 
- weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
+ weighted/imatrix quants are available at https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-i1-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -52,6 +52,18 @@ more details, including on how to concatenate multi-part files.
 |:-----|:-----|--------:|:------|
 | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.mmproj-Q8_0.gguf) | mmproj-Q8_0 | 0.7 | multi-modal supplement |
 | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.mmproj-f16.gguf) | mmproj-f16 | 1.0 | multi-modal supplement |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q2_K.gguf) | Q2_K | 1.8 | |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q3_K_S.gguf) | Q3_K_S | 2.0 | |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q3_K_M.gguf) | Q3_K_M | 2.2 | lower quality |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q3_K_L.gguf) | Q3_K_L | 2.3 | |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.IQ4_XS.gguf) | IQ4_XS | 2.4 | |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q4_K_S.gguf) | Q4_K_S | 2.5 | fast, recommended |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q4_K_M.gguf) | Q4_K_M | 2.6 | fast, recommended |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q5_K_S.gguf) | Q5_K_S | 2.9 | |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q5_K_M.gguf) | Q5_K_M | 2.9 | |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q6_K.gguf) | Q6_K | 3.3 | very good quality |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.Q8_0.gguf) | Q8_0 | 4.2 | fast, best quality |
+ | [GGUF](https://huggingface.co/mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF/resolve/main/MamayLM-Gemma-3-4B-IT-v1.0.f16.gguf) | f16 | 7.9 | 16 bpw, overkill |
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
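For anyone unsure how to use the GGUF files added in the table above (the README itself defers to TheBloke's guides for details), here is a minimal sketch of one common route: download a single quant with `huggingface_hub` and run it with `llama-cpp-python`. Both packages, the choice of the Q4_K_M quant, and all generation parameters are illustrative assumptions, not something this README prescribes:

```python
# Minimal sketch. Assumes `pip install huggingface_hub llama-cpp-python`;
# these packages are an assumption, not prescribed by this README.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quant from the table above (Q4_K_M: "fast, recommended").
model_path = hf_hub_download(
    repo_id="mradermacher/MamayLM-Gemma-3-4B-IT-v1.0-GGUF",
    filename="MamayLM-Gemma-3-4B-IT-v1.0.Q4_K_M.gguf",
)

# Load the model; n_ctx (context window) is an illustrative value.
llm = Llama(model_path=model_path, n_ctx=4096)

# One-shot chat completion; the prompt is illustrative only.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly introduce yourself."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

Any other quant from the table works the same way by changing `filename`; per the notes column, larger quants trade download size and memory for quality. The mmproj files are multi-modal projector supplements intended to be used alongside a main model file by llama.cpp's multimodal tooling, not loaded as standalone models.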