chtxxxxx committed · verified
Commit 24ad0b4 · 1 Parent(s): 8b7b2dd

Update README.md

Files changed (1):
  1. README.md +9 -9
README.md CHANGED
@@ -39,15 +39,15 @@ SEA-LION stands for _Southeast Asian Languages In One Network_.
  This repo contains `GGUF` format model files for [aisingapore/Gemma-SEA-LION-v3-9B-IT](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT).
 
  #### Model Weights Included in this repository:
- - [gemma2-9b-cpt-sea-lionv3-instruct-F16](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-F16.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q2_K](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q2_K.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q3_K_M](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q3_K_M.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q4_0](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q4_0.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q4_K_M](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q4_K_M.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q5_0](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q5_0.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q5_K_M](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q5_K_M.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q6_K](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q6_K.gguf)
- - [gemma2-9b-cpt-sea-lionv3-instruct-Q8_0](https://huggingface.co/aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf/blob/main/gemma2-9b-cpt-sea-lionv3-instruct-Q8_0.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-F16](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-F16.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q2_K](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q2_K.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q3_K_M](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q3_K_M.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q4_0](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q4_0.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q4_K_M](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q4_K_M.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q5_0](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q5_0.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q5_K_M](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q5_K_M.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q6_K](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q6_K.gguf)
+ - [Gemma-SEA-LION-v3-9B-IT-Q8_0](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q8_0.gguf)
 
  ### Caveats
  It is important for users to be aware that our model exhibits certain limitations that warrant consideration. Like many LLMs, the model can hallucinate and occasionally generates irrelevant content, introducing fictional elements that are not grounded in the provided context. Users should also exercise caution in interpreting and validating the model's responses due to the potential inconsistencies in its reasoning.
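
As a quick sanity check of the rename, the nine new file URLs can be generated from the repo ID and quantization suffixes listed in the `+` lines above. A minimal sketch; note the `resolve/main` path is the standard Hugging Face direct-download form, an assumption beyond the `blob/main` page links shown in the diff:

```python
# Reconstruct the renamed GGUF download URLs from the diff's "+" lines.
REPO = "aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF"
QUANTS = ["F16", "Q2_K", "Q3_K_M", "Q4_0", "Q4_K_M", "Q5_0", "Q5_K_M", "Q6_K", "Q8_0"]

def gguf_url(quant: str) -> str:
    """Direct-download URL for one quantization variant of the renamed model."""
    filename = f"Gemma-SEA-LION-v3-9B-IT-{quant}.gguf"
    return f"https://huggingface.co/{REPO}/resolve/main/{filename}"

for q in QUANTS:
    print(gguf_url(q))
```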