llama.cpp
0208355f
- CUDA: fix race conditions in FlashAttention kernels (#13438)
Commit
124 days ago
CUDA: fix race conditions in FlashAttention kernels (#13438)
References
#13438 - CUDA: fix race conditions in FlashAttention kernels
Author
JohannesGaessler
Parents
d2a4ef05
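The commit title points at missing synchronization in shared-memory use inside the FlashAttention kernels. The sketch below is not the actual llama.cpp code; it is a minimal, hypothetical reduction kernel showing the general class of bug such a fix addresses: one loop iteration overwriting a shared-memory tile while other threads may still be reading it, cured by an extra `__syncthreads()`.

```cuda
#include <cuda_runtime.h>

// Hypothetical tile-sum kernel (illustration only, not llama.cpp code).
// Each block repeatedly stages a tile of the input in shared memory.
__global__ void tile_sum(const float *in, float *out, int n) {
    __shared__ float tile[256];
    float acc = 0.0f;
    for (int base = blockIdx.x * 256; base < n; base += gridDim.x * 256) {
        int i = base + threadIdx.x;
        tile[threadIdx.x] = (i < n) ? in[i] : 0.0f;
        __syncthreads();  // make all writes to tile[] visible
        if (threadIdx.x == 0) {
            float s = 0.0f;
            for (int j = 0; j < 256; ++j) s += tile[j];
            acc += s;
        }
        __syncthreads();  // the fix: without this barrier, the next loop
                          // iteration overwrites tile[] while thread 0 may
                          // still be reading it -- a shared-memory race
    }
    if (threadIdx.x == 0) atomicAdd(out, acc);
}
```

Races of this kind are intermittent and hardware-dependent, which is why tools like `compute-sanitizer --tool racecheck` are typically used to confirm them.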