Add chat.sh script
parent 6a612959e1
commit dc6a845b85

2 changed files with 10 additions and 1 deletion
README.md
@@ -179,8 +179,11 @@ In this mode, you can always interrupt generation by pressing Ctrl+C and enter o
 Here is an example few-shot interaction, invoked with the command
 ```
-./main -m ./models/13B/ggml-model-q4_0.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt
+# default arguments using 7B model
+./chat.sh
+
+# custom arguments using 13B model
+./main -m ./models/13B/ggml-model-q4_0.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt
 ```
 Note the use of `--color` to distinguish between user input and generated text.
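For readers skimming the hunk above, here is the same interactive invocation with each flag annotated. The descriptions are my reading of the `main` program's options at the time of this commit, not text taken from the diff itself:

```
./main \
  -m ./models/13B/ggml-model-q4_0.bin \
  -n 256 \
  --repeat_penalty 1.0 \
  --color \
  -i \
  -r "User:" \
  -f prompts/chat-with-bob.txt
# -m               path to the quantized ggml model file
# -n 256           number of tokens to predict
# --repeat_penalty penalty for repeated tokens (1.0 leaves repetition unpenalized)
# --color          colorize output so user input stands out from generated text
# -i               run in interactive mode
# -r "User:"       reverse prompt: hand control back to the user when this string appears
# -f               read the initial prompt from a file
```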
chat.sh (new executable file)
@@ -0,0 +1,6 @@
+#!/bin/bash
+#
+# Temporary script - will be removed in the future
+#
+
+./main -m ./models/7B/ggml-model-q4_0.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt
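The committed script hard-codes the 7B model and the chat-with-bob prompt. As a sketch only, not part of this commit, the same command could take overridable paths via environment variables:

```
#!/bin/bash
# Hypothetical variant of chat.sh, not part of this commit: defaults match the
# committed script, but MODEL and PROMPT can be overridden, e.g.
#   MODEL=./models/13B/ggml-model-q4_0.bin ./chat.sh

MODEL="${MODEL:-./models/7B/ggml-model-q4_0.bin}"
PROMPT="${PROMPT:-prompts/chat-with-bob.txt}"

./main -m "$MODEL" -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f "$PROMPT"
```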