llama.cpp
Commit f40a80b4: support bf16 and quantized type (#20803)
Date: 3 days ago
Message: support bf16 and quantized type (#20803)
References: #20803 - [SYCL] Support bf16 and quantized type of MUL_MAT
Author: arthw
Parents: db9d8aa4