llama.cpp
Commit 81040f10 - llama : do not allocate KV cache for "vocab_only == true" (#682)
Commit
2 years ago
llama : do not allocate KV cache for "vocab_only == true" (#682)

Fixes sanitizer CI
References
#682 - Be nice to CI machines by not allocating buffers
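The change itself is small: skip allocating the KV cache when the model is loaded with vocab_only set, since a vocabulary-only context never runs eval and therefore never needs those buffers. Below is a minimal, self-contained sketch of that guard. The stand-in structs and the helper llama_init_sketch are illustrative assumptions made for this example, not the actual upstream code; only the names llama_context_params.vocab_only and kv_self follow llama.cpp conventions.

```cpp
#include <cstdio>
#include <vector>

// Minimal stand-in types; the real llama.cpp structs are far larger.
struct llama_context_params {
    int  n_ctx      = 512;
    bool vocab_only = false; // load only the vocabulary, no inference buffers
};

struct llama_kv_cache {
    std::vector<float> k, v; // stays empty until allocated
};

struct llama_context {
    llama_kv_cache kv_self;
};

// Reserve key/value storage for n_layer transformer layers of width n_embd.
static bool kv_cache_init(llama_kv_cache & cache, int n_ctx, int n_embd, int n_layer) {
    const size_t n_elements = (size_t) n_embd * n_layer * n_ctx;
    cache.k.resize(n_elements);
    cache.v.resize(n_elements);
    return true;
}

// The shape of the fix: allocate the KV cache only when the context will
// actually be used for eval; a vocab-only load skips it entirely.
llama_context * llama_init_sketch(const llama_context_params & params) {
    auto * ctx = new llama_context;
    if (!params.vocab_only) {
        if (!kv_cache_init(ctx->kv_self, params.n_ctx, /*n_embd=*/4096, /*n_layer=*/32)) {
            delete ctx;
            return nullptr;
        }
    }
    return ctx;
}

int main() {
    llama_context_params params;
    params.vocab_only = true;
    llama_context * ctx = llama_init_sketch(params);
    // Prints 0: no KV buffers were allocated for the vocab-only context.
    std::printf("KV cache elements: %zu\n", ctx->kv_self.k.size());
    delete ctx;
}
```

For a full model the skipped buffers are large (on the order of n_embd * n_layer * n_ctx floats per tensor), which is why avoiding them matters for memory-constrained CI machines such as the sanitizer jobs this commit fixes.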
Author
sw
Parents
c4f89d8d