llama.cpp
Commit c9c6e01d
282 days ago
vulkan: Add VK_NV_cooperative_matrix2 support for mul_mat and flash attention (#10206)
References
#10206 - vulkan: Add VK_NV_cooperative_matrix2 support for mul_mat and FlashAttention2
Author
jeffbolznv
Parents
6fe62478
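Note: the commit title states that the Vulkan backend gains VK_NV_cooperative_matrix2 support for mul_mat and flash attention. As a minimal, hedged sketch (not the code from this commit), a backend would typically check at device-enumeration time whether the extension is exposed before enabling the corresponding shader paths; the helper name below is hypothetical, but the Vulkan API calls and the extension name string are standard.

#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Hypothetical helper: returns true if the physical device advertises
// the VK_NV_cooperative_matrix2 device extension. Illustrative only.
static bool device_supports_coopmat2(VkPhysicalDevice phys_dev) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(phys_dev, nullptr, &count, nullptr);

    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(phys_dev, nullptr, &count, exts.data());

    for (const VkExtensionProperties & e : exts) {
        // Extension name string as registered in the Vulkan registry.
        if (std::strcmp(e.extensionName, "VK_NV_cooperative_matrix2") == 0) {
            return true;
        }
    }
    return false;
}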