llama.cpp
vulkan: fix NaN issue in flash attention shader #12776
Merged


0cc4m merged 1 commit into ggml-org:master from jeffbolznv:fa_nan
Commit ee66c160 (jeffbolznv): vulkan: fix NaN issue in flash attention shader
jeffbolznv requested a review from 0cc4m 253 days ago
github-actions added the Vulkan and ggml labels
0cc4m approved these changes on 2025-04-06
0cc4m merged 0c74b043 into master 253 days ago
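
Neither the PR description nor the diff is shown above, so what follows is only a sketch of the failure mode the title points at. In the streaming softmax used by flash attention, each row's scores are shifted by a running maximum before exponentiation; if an entire row is masked out to -inf and the running maximum is also initialized to -inf, the shift computes (-inf) - (-inf) = NaN, which then poisons every accumulator it touches. A common remedy, and plausibly what this change does, is to seed the maximum with a large finite value such as -FLT_MAX / 2. The C sketch below illustrates the mechanism; `softmax_row_sum` and the guard constant are illustrative, not taken from the shader.

```c
#include <float.h>
#include <math.h>
#include <stdio.h>

/* Shifted softmax sum over one attention row. If every score is
 * masked to -inf and the running maximum starts at -INFINITY, the
 * shift computes (-inf) - (-inf) = NaN, so expf() returns NaN.
 * Seeding the maximum with a large finite value keeps the
 * arithmetic finite: (-inf) - finite = -inf, and expf(-inf) = 0. */
float softmax_row_sum(const float *scores, int n, int use_finite_seed) {
    float row_max = use_finite_seed ? -FLT_MAX / 2.0f : -INFINITY;
    for (int i = 0; i < n; ++i) {
        if (scores[i] > row_max) {
            row_max = scores[i]; /* stays at the seed for a fully masked row */
        }
    }
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) {
        sum += expf(scores[i] - row_max); /* NaN here when row_max == -inf */
    }
    return sum;
}

int main(void) {
    /* A fully masked row, as produced by e.g. padding or a causal mask. */
    float masked[4] = { -INFINITY, -INFINITY, -INFINITY, -INFINITY };
    printf("unguarded: %f\n", softmax_row_sum(masked, 4, 0)); /* nan */
    printf("guarded:   %f\n", softmax_row_sum(masked, 4, 1)); /* 0.000000 */
    return 0;
}
```

With the finite seed, each exponential on a fully masked row evaluates to expf(-inf) = 0 rather than NaN, keeping the accumulators finite.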
