llama.cpp
vulkan: force full subgroups for flash attention to fix intel subgroup crash #17356
Merged


0cc4m merged 1 commit into master from 0cc4m/vulkan-flash-attn-intel-fix
0cc4m — vulkan: force full subgroups for flash attention to fix intel subgrou… (commit 155a8292)
github-actions added the Vulkan and ggml labels
jeffbolznv approved these changes on 2025-11-18
0cc4m merged 980b7cd1 into master 147 days ago
0cc4m deleted the 0cc4m/vulkan-flash-attn-intel-fix branch 147 days ago