llama.cpp
0c74b043
- vulkan: fix NaN issue in flash attention shader (#12776)
Committed 216 days ago

vulkan: fix NaN issue in flash attention shader (#12776)

Use -FLT_MAX/2 rather than -inf as the initial value for computing the maximum.
References
#12776 - vulkan: fix NaN issue in flash attention shader
Author
jeffbolznv
Parents
80b717d4