llama.cpp
vulkan: Revert forced full subgroup for FlashAttention #18831
Merged

rillomas committed Revert "vulkan: force full subgroups for flash attention to fix intel…" (8f689c2b)
github-actions added the Vulkan label
github-actions added the ggml label
rillomas merged branch master into revert-fa-full-subgroup (4942ab6f)
rillomas marked this pull request as ready for review 8 days ago
rillomas requested a review from 0cc4m 8 days ago
jeffbolznv approved these changes on 2026-01-17
0cc4m approved these changes on 2026-01-21
0cc4m merged 067b8d7a into master 3 days ago
rillomas deleted the revert-fa-full-subgroup branch 2 days ago
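For context: "full subgroups" here refers to the Vulkan subgroup size control feature. A compute pipeline stage can be created with `VK_PIPELINE_SHADER_STAGE_CREATE_REQUIRE_FULL_SUBGROUPS_BIT` (core in Vulkan 1.3, previously from `VK_EXT_subgroup_size_control`), which guarantees that no subgroup in a workgroup is partially filled. The reverted commit forced that flag for the FlashAttention pipelines; this PR undoes it. Below is a minimal, hedged sketch of what setting that flag looks like at pipeline creation time. It is not the actual ggml-vulkan code, and names like `shader_module` and `create_compute_pipeline` are placeholders for illustration only.

```cpp
#include <vulkan/vulkan.h>

// Sketch: create a compute pipeline, optionally requiring full subgroups.
// Hypothetical helper; the real ggml-vulkan pipeline setup differs.
VkPipeline create_compute_pipeline(VkDevice device, VkShaderModule shader_module,
                                   VkPipelineLayout layout, bool require_full_subgroups) {
    VkPipelineShaderStageCreateInfo stage = {};
    stage.sType  = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO;
    stage.stage  = VK_SHADER_STAGE_COMPUTE_BIT;
    stage.module = shader_module;
    stage.pName  = "main";
    if (require_full_subgroups) {
        // Guarantees every subgroup in the workgroup is fully populated,
        // which shaders using subgroup reductions may rely on. This is the
        // flag the reverted commit forced on for FlashAttention.
        stage.flags |= VK_PIPELINE_SHADER_STAGE_CREATE_REQUIRE_FULL_SUBGROUPS_BIT;
    }

    VkComputePipelineCreateInfo info = {};
    info.sType  = VK_STRUCTURE_TYPE_COMPUTE_PIPELINE_CREATE_INFO;
    info.stage  = stage;
    info.layout = layout;

    VkPipeline pipeline = VK_NULL_HANDLE;
    vkCreateComputePipelines(device, VK_NULL_HANDLE, 1, &info, nullptr, &pipeline);
    return pipeline;
}
```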
