llama.cpp
Commit 61795789
batch : require non-coupled batch with sequential split_equal
Committed 107 days ago
batch : require non-coupled batch with sequential split_equal

ggml-ci
References: gg/llama-high-throughput-save2
Author: ggerganov
Parents: 5eb1a88d