llama.cpp
067b8d7a
- Revert "vulkan: force full subgroups for flash attention to fix intel subgroup crash (#17356)" (#18831)
28 days ago
This reverts commit 980b7cd17e055c8c587f79ffda7eb4fddf405566.
References
#18831 - vulkan: Revert forced full subgroup for FlashAttention
Author
rillomas
Parents
50b7f076