llama.cpp
a10b36c9 - llama : refactor kv cache guard (#12695)

Commit · 217 days ago
llama : refactor kv cache guard (#12695)

* llama : refactor kv cache guard (ggml-ci)
* cont : fix comment [no ci]
* llama : fix kv_cache restore logic (ggml-ci)
* context : simplify kv cache updates (ggml-ci)
* cont : better name [no ci]
* llama : fix llama_decode return code when could not find KV slot (ggml-ci)
* context : change log err -> warn [no ci]
* kv-cache : add comment + warning
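One of the sub-commits adjusts the return code of `llama_decode` when no KV slot can be found for a batch. A minimal caller-side sketch of how that return value is conventionally handled is shown below; the `decode_batch` helper and its logging are illustrative only, not part of llama.cpp, and the sketch assumes the convention documented in `llama.h` that 0 means success, a positive value is a recoverable warning (such as failing to find a KV slot), and a negative value is a fatal error.

```cpp
// Caller-side sketch of handling llama_decode's return code.
// Assumes the usual llama.cpp convention: 0 = success,
// > 0 = recoverable warning (e.g. no KV slot found for the batch),
// < 0 = fatal error. decode_batch is a hypothetical helper.
#include "llama.h"
#include <cstdio>

static bool decode_batch(llama_context * ctx, llama_batch batch) {
    const int32_t ret = llama_decode(ctx, batch);

    if (ret == 0) {
        return true; // batch processed, KV cache updated
    }

    if (ret > 0) {
        // Recoverable: no KV slot could be found for the whole batch.
        // The caller can retry with a smaller batch or a larger context.
        fprintf(stderr, "decode warning: could not find a KV slot (ret=%d)\n", ret);
        return false;
    }

    // ret < 0: fatal error
    fprintf(stderr, "decode error: %d\n", ret);
    return false;
}
```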