llama.cpp
Commit ab9851f1
Date: 2 days ago
vulkan: allow using fp16 in coopmat1 flash attention shader
Author: 0cc4m
Committer: 0cc4m
Parents: 1946e46f
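The commit title concerns the Vulkan backend's flash attention shader on the coopmat1 path (the shaders built on the VK_KHR_cooperative_matrix extension, as opposed to the coopmat2 / VK_NV_cooperative_matrix2 path). As a minimal sketch of the numerical trade-off behind making fp16 accumulation an option rather than the default, the following is not llama.cpp or shader code: it runs one online-softmax flash-attention row reduction with fp32 accumulators, then again with every intermediate rounded to fp16, the way a shader keeping its accumulators in 16-bit floats would. It assumes a compiler with the `_Float16` extension (recent GCC or Clang on x86-64 or AArch64).

```cpp
// Hypothetical illustration only: fp32 vs fp16 accumulation in a
// flash-attention-style online softmax over a single attention row.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int n = 4096; // length of one attention row
    std::mt19937 rng(42);
    std::normal_distribution<float> dist(0.0f, 1.0f);
    std::vector<float> scores(n), values(n);
    for (int i = 0; i < n; ++i) { scores[i] = dist(rng); values[i] = dist(rng); }

    // fp32 reference: running max m, softmax normalizer l, weighted sum acc.
    float m32 = -INFINITY, l32 = 0.0f, acc32 = 0.0f;
    for (int i = 0; i < n; ++i) {
        float m_new = std::max(m32, scores[i]);
        float scale = std::exp(m32 - m_new);       // rescale old partial sums
        float p     = std::exp(scores[i] - m_new); // current softmax weight
        l32   = l32 * scale + p;
        acc32 = acc32 * scale + p * values[i];
        m32   = m_new;
    }

    // Same recurrence with all accumulators held in fp16, mimicking a shader
    // that keeps its flash attention state in 16-bit floats.
    _Float16 m16 = (_Float16)-INFINITY, l16 = 0, acc16 = 0;
    for (int i = 0; i < n; ++i) {
        _Float16 s     = (_Float16)scores[i];
        _Float16 m_new = m16 > s ? m16 : s;
        _Float16 scale = (_Float16)std::exp((float)(m16 - m_new));
        _Float16 p     = (_Float16)std::exp((float)(s - m_new));
        l16   = (_Float16)(l16 * scale + p);
        acc16 = (_Float16)(acc16 * scale + p * (_Float16)values[i]);
        m16   = m_new;
    }

    std::printf("fp32 result: %.6f\n", acc32 / l32);
    std::printf("fp16 result: %.6f\n", (float)(acc16 / l16));
    return 0;
}
```

On long rows the fp16 accumulators drift measurably from the fp32 reference, which is presumably why a shader would expose fp16 accumulation as an opt-in for speed rather than use it unconditionally.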