chat-ui
eb071be4 - Fix prompt caching on llama.cpp endpoints (#920)

Commit (1 year ago)

Fix prompt caching on llama.cpp endpoints (#920)

Explicitly enable prompt caching on llama.cpp endpoints.

Co-authored-by: Nathan Sarrazin <sarrazin.nathan@gmail.com>
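The gist of the change is that the llama.cpp server does not reuse its KV cache across requests unless the client asks for it via the `cache_prompt` flag in the `/completion` request body. Below is a minimal sketch of what "explicitly enable prompt caching" looks like on the client side; the type and function names are illustrative, not chat-ui's actual code.

```typescript
// Illustrative subset of the llama.cpp server /completion request body.
type LlamaCppCompletionRequest = {
  prompt: string;
  n_predict?: number;
  // When true, the server reuses the KV cache for the shared prompt
  // prefix from the previous request, avoiding re-evaluating it.
  cache_prompt?: boolean;
};

// Build a request body with prompt caching explicitly enabled.
function buildCompletionRequest(prompt: string): LlamaCppCompletionRequest {
  return {
    prompt,
    n_predict: 256,
    cache_prompt: true,
  };
}
```

In a chat setting this matters because every turn resends the growing conversation as the prompt; with `cache_prompt: true` the server only evaluates the new suffix instead of the full history.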