llama.cpp
d8919424 - CUDA: fix FlashAttention on Turing (#13415)
