llama.cpp
ab9851f1 - vulkan: allow using fp16 in coopmat1 flash attention shader

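The commit title suggests the coopmat1 flash attention path can now perform its matrix math in fp16 when the device supports it. As a rough illustration (not code from llama.cpp), the sketch below shows the kind of Vulkan capability probe that typically gates such a shader variant: checking that the physical device reports 16-bit float shader arithmetic and the KHR cooperative matrix feature. The function and variable names are hypothetical.

```cpp
// Illustrative capability probe, assuming a coopmat1 fp16 path is only used
// when the device advertises fp16 shader arithmetic and KHR cooperative
// matrices. A real implementation would also confirm that
// VK_KHR_cooperative_matrix appears in the device's extension list before
// chaining its feature struct.
#include <vulkan/vulkan.h>

bool device_supports_fp16_coopmat(VkPhysicalDevice dev) {
    // Cooperative matrix support (VK_KHR_cooperative_matrix).
    VkPhysicalDeviceCooperativeMatrixFeaturesKHR coopmat_features = {};
    coopmat_features.sType =
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_COOPERATIVE_MATRIX_FEATURES_KHR;

    // 16-bit float arithmetic in shaders (shaderFloat16).
    VkPhysicalDeviceShaderFloat16Int8Features fp16_features = {};
    fp16_features.sType =
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SHADER_FLOAT16_INT8_FEATURES;
    fp16_features.pNext = &coopmat_features;

    VkPhysicalDeviceFeatures2 features2 = {};
    features2.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2;
    features2.pNext = &fp16_features;

    vkGetPhysicalDeviceFeatures2(dev, &features2);

    return fp16_features.shaderFloat16 == VK_TRUE &&
           coopmat_features.cooperativeMatrix == VK_TRUE;
}
```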