chat-ui
eb071be4 - Fix prompt caching on llama.cpp endpoints (#920)
Commit
1 year ago
Fix prompt caching on llama.cpp endpoints (#920)

Explicitly enable prompt caching on llama.cpp endpoints.

Co-authored-by: Nathan Sarrazin <sarrazin.nathan@gmail.com>
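The change amounts to setting llama.cpp's `cache_prompt` flag in the completion request, so the server reuses its KV cache when consecutive requests share a prompt prefix (as chat turns do). A minimal sketch of such a request, assuming a llama.cpp server reachable at a hypothetical `LLAMACPP_URL` and its `/completion` endpoint:

```ts
// Sketch: call a llama.cpp server with prompt caching explicitly enabled.
// LLAMACPP_URL is an assumption; point it at your own deployment.
const LLAMACPP_URL = "http://127.0.0.1:8080";

async function complete(prompt: string): Promise<string> {
  const res = await fetch(`${LLAMACPP_URL}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      n_predict: 256,
      cache_prompt: true, // reuse the KV cache from the previous request
    }),
  });
  if (!res.ok) throw new Error(`llama.cpp server error: ${res.status}`);
  const data = (await res.json()) as { content: string };
  return data.content;
}
```

Without `cache_prompt`, the server re-evaluates the entire conversation prompt on every turn; with it, only the newly appended tokens are processed, which is what makes multi-turn chat against llama.cpp endpoints responsive.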
References
#920 - Fix prompt caching on llama.cpp endpoints
Author
reversebias
Parents
2edb2788