llama.cpp
server : recognize cache_prompt parameter in OAI API #4347 (Merged)

ggerganov merged 1 commit into master from gg/server-oai-cache-prompt
Commit ef455cb1 (ggerganov): server : recognize cache_prompt parameter in OAI API
ggerganov added the need feedback label
ggerganov merged 05cd6e50 into master 2 years ago
ggerganov deleted the gg/server-oai-cache-prompt branch 2 years ago
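
With this change, the server's OpenAI-compatible chat endpoint accepts the `cache_prompt` flag alongside the standard OpenAI fields, so clients can ask the server to reuse the evaluated prompt's KV cache across requests. A minimal sketch of such a request body (the host/port and model name are assumptions; only `cache_prompt` is the llama.cpp-specific extension):

```python
import json

# OpenAI-style chat completion request body. "cache_prompt" is the
# llama.cpp server extension that this PR makes the OAI API recognize;
# the other fields follow the standard OpenAI chat schema.
payload = {
    "model": "gpt-3.5-turbo",  # placeholder; the local server serves its loaded model
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "cache_prompt": True,  # reuse the prompt's KV cache between requests
}

body = json.dumps(payload)
# POST this to the server's OAI-compatible endpoint, e.g.
# http://localhost:8080/v1/chat/completions (default address is an assumption)
print(body)
```

Because the flag rides in the same JSON body, existing OpenAI client code only needs one extra key to opt into prompt caching; unknown fields were previously ignored by the endpoint.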
