llama.cpp
d3699366
fix: Update recurrent cache for changes to remove intermediate kv_cache interface
Commit
196 days ago
fix: Update recurrent cache for changes to remove intermediate kv_cache interface

Branch: HybridRecurrentCache

Signed-off-by: Gabe Goodhart <ghart@us.ibm.com>
References
#13979 - Hybrid recurrent cache
Author
gabe-l-hart
Committer
gabe-l-hart
Parents
a9b5fe98