Update README.md

SEA-LION stands for _Southeast Asian Languages In One Network_.

This repo contains `GGUF` format model files for [aisingapore/Gemma-SEA-LION-v3-9B-IT](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT).

#### Model Weights Included in this repository:

- [Gemma-SEA-LION-v3-9B-IT-F16](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-F16.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q2_K](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q2_K.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q3_K_M](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q3_K_M.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q4_0](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q4_0.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q4_K_M](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q4_K_M.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q5_0](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q5_0.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q5_K_M](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q5_K_M.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q6_K](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q6_K.gguf)
- [Gemma-SEA-LION-v3-9B-IT-Q8_0](https://huggingface.co/aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF/blob/main/Gemma-SEA-LION-v3-9B-IT-Q8_0.gguf)
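
The files above can be used with any GGUF-compatible runtime. As an illustration, the sketch below downloads one quantisation with `huggingface_hub` and runs it with `llama-cpp-python`; the package choice, context length, and prompt are assumptions for this example, not requirements of the repository.

```python
# Minimal sketch: fetch one GGUF file from this repo and run a short chat completion.
# Assumes `pip install huggingface_hub llama-cpp-python`; settings below are illustrative.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a mid-sized quantisation from the list above.
model_path = hf_hub_download(
    repo_id="aisingapore/Gemma-SEA-LION-v3-9B-IT-GGUF",
    filename="Gemma-SEA-LION-v3-9B-IT-Q4_K_M.gguf",
)

# Load the model and generate a reply (context length chosen arbitrarily for the demo).
llm = Llama(model_path=model_path, n_ctx=4096)
output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Apa sentimen dari kalimat berikut ini? Saya suka nasi lemak!"}],
    max_tokens=128,
)
print(output["choices"][0]["message"]["content"])
```

Other GGUF runtimes (for example `llama.cpp`'s `llama-cli` or Ollama) can load the same files; swap the `filename` above for a different quantisation as needed.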
### Caveats

It is important for users to be aware that our model exhibits certain limitations that warrant consideration. Like many LLMs, the model can hallucinate and occasionally generate irrelevant content, introducing fictional elements that are not grounded in the provided context. Users should also exercise caution when interpreting and validating the model's responses, given the potential for inconsistencies in its reasoning.