llama.cpp
CUDA: fix typo in FlashAttention code #13926 (Merged)

JohannesGaessler authored commit eb420553: CUDA: fix typo in FlashAttention code
github-actions added the Nvidia GPU and ggml labels
slaren approved these changes on 2025-05-30
JohannesGaessler merged commit e562eece into master 99 days ago
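
Note: this page does not include the diff itself, so the snippet below is only a hypothetical sketch of the *kind* of one-token typo such a patch typically corrects in CUDA attention code. The kernel, its names (`attn_scores`, `stride_q`, `stride_k`), and the specific bug shown are illustrative assumptions, not the actual change merged in e562eece.

```cuda
// Hypothetical illustration of a stride typo in a toy attention-score kernel.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// Each thread computes one score S[row][col] = dot(Q[row], K[col]).
__global__ void attn_scores(const float * Q, const float * K, float * S,
                            int head_dim, int n_cols,
                            int stride_q, int stride_k) {
    const int row = blockIdx.y;
    const int col = blockIdx.x*blockDim.x + threadIdx.x;
    if (col >= n_cols) {
        return;
    }
    float sum = 0.0f;
    for (int d = 0; d < head_dim; ++d) {
        // A classic one-token typo: indexing K with stride_q instead of
        // stride_k silently reads the wrong memory whenever the two
        // strides differ:
        //   sum += Q[row*stride_q + d] * K[col*stride_q + d]; // bug
        sum += Q[row*stride_q + d] * K[col*stride_k + d];      // fixed
    }
    S[row*n_cols + col] = sum;
}

int main() {
    const int head_dim = 4, n_rows = 2, n_cols = 3;
    std::vector<float> hQ(n_rows*head_dim, 1.0f);
    std::vector<float> hK(n_cols*head_dim, 2.0f);
    std::vector<float> hS(n_rows*n_cols);

    float *dQ, *dK, *dS;
    cudaMalloc(&dQ, hQ.size()*sizeof(float));
    cudaMalloc(&dK, hK.size()*sizeof(float));
    cudaMalloc(&dS, hS.size()*sizeof(float));
    cudaMemcpy(dQ, hQ.data(), hQ.size()*sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dK, hK.data(), hK.size()*sizeof(float), cudaMemcpyHostToDevice);

    // Contiguous row-major layout, so both strides equal head_dim here;
    // the bug above only manifests when the strides differ.
    dim3 grid((n_cols + 31)/32, n_rows);
    attn_scores<<<grid, 32>>>(dQ, dK, dS, head_dim, n_cols, head_dim, head_dim);
    cudaMemcpy(hS.data(), dS, hS.size()*sizeof(float), cudaMemcpyDeviceToHost);

    printf("S[0][0] = %.1f (expected %.1f)\n", hS[0], 2.0f*head_dim);
    cudaFree(dQ); cudaFree(dK); cudaFree(dS);
    return 0;
}
```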
