llama.cpp
d4c19c0f
- server : accept extra_context for the infill endpoint (#9874)
Commit
1 year ago
server : accept extra_context for the infill endpoint (#9874)

* server : accept extra_context for the infill endpoint

ggml-ci

* server : update readme [no ci]

* server : use repo-level FIM pattern if possible

ggml-ci
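A minimal sketch of how a client might pass extra context to the server's /infill endpoint after this change. The field names ("input_prefix", "input_suffix", "input_extra") and the server address are assumptions drawn from the llama.cpp server README rather than from this commit, so treat them as illustrative only; the server must be running with a FIM-capable model.

```python
# Sketch: POST /infill with additional repo context (field names assumed).
import json
import urllib.request

payload = {
    "input_prefix": 'def remove_non_ascii(s: str) -> str:\n    """',
    "input_suffix": "\n    return result\n",
    # The extra context this commit adds support for: snippets from other
    # files in the repo that the model can condition on while infilling.
    "input_extra": [
        {"filename": "utils.py", "text": "ASCII_MAX = 127\n"},
    ],
    "n_predict": 64,
}

req = urllib.request.Request(
    "http://127.0.0.1:8012/infill",          # assumed host/port
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["content"])  # generated infill text
```

Per the last item in the commit message, when the model exposes repo-level FIM tokens the server prefers that pattern for wrapping the extra chunks instead of plain concatenation.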
References
#9874 - server : accept extra_context for the infill endpoint
Author
ggerganov
Parents
c7181bd2