llama.cpp
CUDA: fix race conditions in FlashAttention kernels #13438
Merged


JohannesGaessler — commit e61d8f01: CUDA: fix race conditions FlashAttention kernels
github-actions added labels: Nvidia GPU, ggml
JohannesGaessler changed the title from "CUDA: fix race conditions FlashAttention kernels" to "CUDA: fix race conditions in FlashAttention kernels" 126 days ago
CISC approved these changes on 2025-05-10
JohannesGaessler merged 0208355f into master 126 days ago
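The PR page carries no description, but the title names a well-known bug class: in a CUDA kernel, one group of threads writes a shared-memory tile while another reads it, with no barrier in between, so readers can observe stale data. The usual fix is an added `__syncthreads()` between the write and read phases. As a hedged, CPU-side illustration of that pattern (this is an analogy, not the actual patch; names like `worker` and `shared` are invented here), `threading.Barrier` plays the role of `__syncthreads()`:

```python
# CPU-side analogy of the race-condition class the PR title describes:
# "write shared memory, then read it, with no barrier in between".
# threading.Barrier stands in for CUDA's __syncthreads().
import threading

N = 4
shared = [0] * N                 # stands in for a shared-memory tile
barrier = threading.Barrier(N)   # analogue of __syncthreads()
results = [0] * N

def worker(tid: int) -> None:
    shared[tid] = tid + 1        # phase 1: each thread writes its own slot
    barrier.wait()               # without this, phase 2 may read stale zeros
    results[tid] = sum(shared)   # phase 2: each thread reads every slot

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # every thread sees the fully written tile: [10, 10, 10, 10]
```

Removing the `barrier.wait()` makes the result timing-dependent, which is exactly why such bugs surface only intermittently and on some GPUs but not others.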