llama.cpp
7a74dee6 - llama : temporary disable Q6_K output quantization (#1711)
