chat-ui
Fix prompt caching on llama.cpp endpoints
#920
Merged


reversebias — Explicitly enable prompt caching on llama.cpp endpoints (0b3e42a5)
nsarrazin approved these changes on 2024-03-11
nsarrazin — Merge branch 'main' into fix/llama_cpp_prompt_caching (79549238)
nsarrazin merged eb071be4 into main 2 years ago
