llama.cpp
vulkan: Revert forced full subgroup for FlashAttention
#18831
Open

rillomas wants to merge 2 commits into ggml-org:master from rillomas:revert-fa-full-subgroup
rillomas: Revert "vulkan: force full subgroups for flash attention to fix intel…" (8f689c2b)
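
The reverted change concerns a Vulkan pipeline-creation flag. As a minimal sketch (not llama.cpp's actual code, and assuming the forced behavior maps to the core Vulkan 1.3 require-full-subgroups flag promoted from VK_EXT_subgroup_size_control), the hypothetical helper below shows how a compute shader stage would be forced to launch only fully populated subgroups:

```cpp
#include <vulkan/vulkan.h>

// Hypothetical helper, not from ggml-vulkan: builds the shader-stage info for
// a FlashAttention compute pipeline. When `require_full_subgroups` is set, the
// driver must launch only fully populated subgroups for this stage.
static VkPipelineShaderStageCreateInfo make_fa_stage_info(VkShaderModule module,
                                                          bool require_full_subgroups) {
    VkPipelineShaderStageCreateInfo stage = {};
    stage.sType  = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO;
    stage.stage  = VK_SHADER_STAGE_COMPUTE_BIT;
    stage.module = module;
    stage.pName  = "main"; // entry point of the compute shader
    if (require_full_subgroups) {
        // Core in Vulkan 1.3 (promoted from VK_EXT_subgroup_size_control).
        stage.flags |= VK_PIPELINE_SHADER_STAGE_CREATE_REQUIRE_FULL_SUBGROUPS_BIT;
    }
    return stage;
}
```

Requiring full subgroups lets a shader use subgroup operations without guarding against partial subgroups, but it constrains the driver's choice of dispatch configuration, which is presumably why forcing it on all devices regressed some of them and is being reverted here.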
github-actions added the Vulkan and ggml labels
rillomas: Merge branch 'master' into revert-fa-full-subgroup (4942ab6f)
rillomas marked this pull request as ready for review 5 days ago
rillomas requested a review from 0cc4m 5 days ago
jeffbolznv approved these changes on 2026-01-17
