llama.cpp
Commit 10f19c11
llama : have n_batch default to 512 (#1091)
Committed 2 years ago
llama : have n_batch default to 512 (#1091)

* set default n_batch to 512 when using BLAS
* spacing
* alternate implementation of setting different n_batch for BLAS
* set n_batch to 512 for all cases
Author: eiery
Parent: 7e312f16