llama.cpp
94112882
- main : evaluate tokens in batches after swapping context (#1014)
Commit
2 years ago
main : evaluate tokens in batches after swapping context (#1014)

* examples : evaluate tokens in batches after swapping context
* Update examples/main/main.cpp

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
References
#1014 - examples : evaluate tokens in batches after swapping context
Author
grencez
Parents
8687c1f2