feihu.hf committed
Commit · 90862c4
Parent(s): ed70c9b
update README
README.md CHANGED
@@ -46,7 +46,7 @@ We advise you to clone [`llama.cpp`](https://github.com/ggerganov/llama.cpp) and
 In the following demonstration, we assume that you are running commands under the repository `llama.cpp`.
 
 ```shell
-./llama-cli -hf Qwen/Qwen3-1.7B:Q8_0 --jinja --color -ngl 99 -fa -sm row --temp 0.6 --top-k 20 --top-p 0.95 --min-p 0 --presence-penalty 1.5 -c 40960 -n 32768 --no-context-shift
+./llama-cli -hf Qwen/Qwen3-1.7B-GGUF:Q8_0 --jinja --color -ngl 99 -fa -sm row --temp 0.6 --top-k 20 --top-p 0.95 --min-p 0 --presence-penalty 1.5 -c 40960 -n 32768 --no-context-shift
 ```
 
 ### ollama
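For context: the `-hf` flag of `llama-cli` downloads a GGUF directly from a Hugging Face repo, and the quantized files live under `Qwen/Qwen3-1.7B-GGUF`, which is why the repo reference changes here. A minimal smoke test of the corrected reference, assuming a llama.cpp build with download (CURL) support; `-p` is added only to run a one-shot prompt and is not part of the diff:

```shell
# Resolve and download Qwen/Qwen3-1.7B-GGUF:Q8_0, then answer a single prompt.
./llama-cli -hf Qwen/Qwen3-1.7B-GGUF:Q8_0 --jinja --temp 0.6 --top-k 20 --top-p 0.95 --min-p 0 \
  -p "Briefly introduce yourself."
```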
params CHANGED
@@ -9,5 +9,6 @@
 "presence_penalty" : 1.5,
 "top_k" : 20,
 "top_p" : 0.95,
-"num_predict" : 32768
+"num_predict" : 32768,
+"num_ctx": 40960
 }
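The params file supplies default generation options when this repo is pulled through ollama's Hugging Face integration, which is presumably why `num_ctx` is added alongside `num_predict` to match the `-c 40960` context length in the README. A minimal usage sketch, assuming an ollama version that supports pulling GGUF quants from hf.co:

```shell
# Pull the Q8_0 quant; the repo's params (top_k, top_p, presence_penalty,
# num_predict, num_ctx) become the model's default generation settings.
ollama run hf.co/Qwen/Qwen3-1.7B-GGUF:Q8_0

# Defaults can still be overridden in the interactive session, e.g.:
#   >>> /set parameter num_ctx 40960
```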