llama.cpp
f40a80b4 - support bf16 and quantized type (#20803)
