llama.cpp
Add single-client multi-prompt support
#4232
Merged

ziedbha: add multiprompt support (09562678)
ziedbha: cleanup (ff67c764)
ziedbha: more cleanup (5906fb44)
ziedbha commented on 2023-11-27
ziedbha: remove atomicity of id_gen, and change lock_guard to unique_lock on… (e2ee3776)
ggerganov commented on 2023-11-29
ziedbha: remove all references to mutex_multitasks (38ce5d02)
ziedbha requested a review from ggerganov 2 years ago
ziedbha requested a review from cebtenzzre 2 years ago
cebtenzzre commented on 2023-11-29
ziedbha: Update examples/server/server.cpp (09da4b14)
ziedbha: Update examples/server/server.cpp (0e1a5aa5)
ziedbha: Update examples/server/server.cpp (14785e11)
ziedbha: Update examples/server/server.cpp (3b371e10)
cebtenzzre commented on 2023-11-30
Green-Sky commented on 2023-11-30
ggerganov approved these changes on 2023-11-30
ziedbha: change to set (0f175a60)
ziedbha: merge with base (73df0c43)
ggerganov merged f43f0936 into master 2 years ago