llama.cpp
server: fix passing prompt as tokens
#5955
Merged