llama.cpp
10f19c11 - llama : have n_batch default to 512 (#1091)

Commit
2 years ago
llama : have n_batch default to 512 (#1091)

* set default n_batch to 512 when using BLAS
* spacing
* alternate implementation of setting different n_batch for BLAS
* set n_batch to 512 for all cases
Author
eiery
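
The change itself is a default-value bump for the batch size used during prompt processing. Below is a minimal sketch of the idea, assuming a parameter struct along the lines of llama.cpp's command-line options; the struct and field names here are illustrative, not taken from the commit.

// Sketch only, not the actual llama.cpp source: names and comments are
// assumptions meant to illustrate the change described in the commit message.
#include <cstdint>

struct sketch_params {
    int32_t n_ctx   = 512;   // context window size (illustrative default)

    // Batch size for prompt evaluation. The commit raises the default to 512:
    // first only for BLAS builds (large batches are what make BLAS pay off),
    // then, per the final bullet, for all builds.
    int32_t n_batch = 512;
};

int main() {
    sketch_params params;    // n_batch defaults to 512 without any CLI flag
    return params.n_batch == 512 ? 0 : 1;
}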