llama.cpp
980b7cd1
- vulkan: force full subgroups for flash attention to fix intel subgroup crash (#17356)
Commit
60 days ago
References
#17356 - vulkan: force full subgroups for flash attention to fix intel subgroup crash
Author
0cc4m
Parents
c49daff5