escaping prompt for cfg_negative_prompt and consecutive prompts in main with interactive #3623
dfeda32a  infill tokens correction
8bd24b2e  Merge branch 'ggerganov:master' into master
6796e745  server infill tokens correction
377be2f3  removing any leading whitespace from infill suffix and removing leead…
b4046aab  removing any leading whitespace from infill suffix and removing leead…
05265607  only rm when params.escape, rm space if possible which is added back …
63ba0b62  only rm when params.escape, rm space if possible which is added back …
003c15bf  Revert "only rm when params.escape, rm space if possible which is add…
fc01dc0c  Merge branch 'master' of github.com:ggerganov/llama.cpp
c3a7f848  fix interactive prompt escaping and fix server infill leading space h…
b1b6beff  rm unnecessary bool check
4a214689  Merge branch 'master' of github.com:ggerganov/llama.cpp
d9dae931  Merge branch 'master' of github.com:ggerganov/llama.cpp
141329f8  Merge branch 'master' of github.com:ggerganov/llama.cpp
35177291  Merge branch 'master' of github.com:ggerganov/llama.cpp
9b608da8  Merge branch 'ggerganov:master' into master
8eef9583  Merge branch 'ggerganov:master' into master
ee652b2a  process escapes for neg prompt and interactive consec prompts
52a77674  Merge branch 'ggerganov:master' into master
staviq commented on 2023-10-14
vvhg1 commented on 2023-10-18
02ac367d  removed unnecessary static string escape
f0d3971d  Merge branch 'master' of github.com:vvhg1/llama.cpp
ggerganov approved these changes on 2023-10-19
97d67e8a  Merge branch 'master' of github.com:ggerganov/llama.cpp
ggerganov merged d3956aea into master 2 years ago