llama.cpp
68e210b3 - server : enable continuous batching by default (#6231)

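For context: before this commit, the llama.cpp server's continuous batching (dynamic batching of concurrent requests across slots) was opt-in via the `-cb, --cont-batching` flag; this commit flips the default so it is enabled out of the box. Below is a minimal illustrative sketch of that kind of default flip, assuming a parameters struct with a boolean `cont_batching` field; the struct and surrounding code here are hypothetical, not the actual diff from commit 68e210b3.

```cpp
// Illustrative sketch only -- not the real code changed by this commit.
// It shows the shape of the change: a boolean server parameter whose
// compiled-in default flips from false to true, with the CLI flag left
// in place so users can still override it.
#include <cstdio>

struct server_params_sketch {
    // Before: continuous batching was off unless the user passed
    // -cb / --cont-batching on the command line.
    // After this commit: on by default.
    bool cont_batching = true;  // was: false
};

int main() {
    server_params_sketch params;
    std::printf("continuous batching default: %s\n",
                params.cont_batching ? "enabled" : "disabled");
    return 0;
}
```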