llama.cpp
Commit 21c84b5d
CUDA: fix Volta FlashAttention logic (#11615)
275 days ago
References
#11615 - CUDA: fix Volta FlashAttention logic
Author
JohannesGaessler
Parents
d92cb67e