llama.cpp
d8919424 - CUDA: fix FlashAttention on Turing (#13415)

Committed 1 year ago