llama.cpp
05cd6e50 - server : recognize cache_prompt parameter in OAI API (#4347)
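As a sketch of what this change enables: an OpenAI-style chat completion request sent to a llama.cpp server can carry the llama.cpp-specific cache_prompt flag alongside the standard OAI fields. The host, port, endpoint path, and model name below are illustrative assumptions, not taken from the commit itself.

# Sketch: send an OpenAI-style chat completion request to a locally running
# llama.cpp server, including the llama.cpp-specific cache_prompt flag.
# The URL and model name are assumptions for illustration.
import json
import urllib.request

payload = {
    "model": "local-model",  # placeholder; local servers often ignore this field
    "messages": [
        {"role": "user", "content": "Summarize the benefits of prompt caching."}
    ],
    "cache_prompt": True,  # llama.cpp-specific: ask the server to reuse the cached prompt
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",  # assumed local llama.cpp server
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["choices"][0]["message"]["content"])

With cache_prompt set, the server can reuse the KV cache for a repeated prompt prefix across requests instead of re-evaluating it, which is presumably the motivation for accepting the flag through the OAI-compatible endpoint as well.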
Commit
1 year ago
References
#4347 - server : recognize cache_prompt parameter in OAI API
Author
ggerganov
Parents
caa92492