llama.cpp
Commit d4c19c0f (1 year ago)
server : accept extra_context for the infill endpoint (#9874)

* server : accept extra_context for the infill endpoint (ggml-ci)
* server : update readme [no ci]
* server : use repo-level FIM pattern if possible (ggml-ci)
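A minimal sketch of what a client request carrying `extra_context` to the server's `/infill` endpoint might look like. The chunk fields (`filename`, `text`), the `input_prefix`/`input_suffix` fields, and the port are assumptions based on this commit's description, not a verified API reference; check the server README for the exact schema.

```python
import json

# Hypothetical /infill request payload: prefix/suffix around the fill site,
# plus extra_context chunks from other repo files that the server can fold
# into the fill-in-the-middle (FIM) prompt using a repo-level FIM pattern.
payload = {
    "input_prefix": "def helper(x):\n    return ",
    "input_suffix": "\n\nprint(helper(2))\n",
    "extra_context": [
        # Field names here are an assumption; verify against the README.
        {"filename": "utils.py", "text": "def double(x):\n    return 2 * x\n"},
    ],
    "n_predict": 32,
}

body = json.dumps(payload)
print(body)
```

The serialized body would then be POSTed to the server, e.g. with `curl -X POST http://localhost:8080/infill -d @payload.json` (port and path assumed from the default server setup).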