llama.cpp
94112882 - main : evaluate tokens in batches after swapping context (#1014)

main : evaluate tokens in batches after swapping context (#1014)

* examples : evaluate tokens in batches after swapping context
* Update examples/main/main.cpp

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>