llama.cpp

vulkan: fix coopmat2 flash attention for non-contiguous inputs #11281

Merged
jeffbolznv added commit b2d861ba: vulkan: fix coopmat2 flash attention for non-contiguous inputs
jeffbolznv requested reviews from slaren and 0cc4m
github-actions added the testing, Vulkan, and ggml labels
0cc4m approved these changes on 2025-01-18
0cc4m merged commit 44e18ef9 into master
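
For context, here is a sketch of the class of problem the PR title describes; none of the code below is taken from the PR's actual diff, and the `tensor_view` struct and helper names are hypothetical. ggml describes a tensor by per-dimension element counts `ne[]` and byte strides `nb[]`; permuted or sliced views are non-contiguous, so their strides do not follow the packed row-major pattern. A flash attention kernel that indexes Q/K/V as if they were contiguous reads the wrong elements for such views, so correct addressing has to go through the strides:

```c
#include <stdint.h>
#include <stddef.h>

// Hypothetical stand-in for a ggml-style tensor descriptor: ne[] holds
// element counts per dimension, nb[] holds byte strides per dimension.
struct tensor_view {
    const uint8_t *data; // base pointer
    int64_t ne[4];       // number of elements per dimension
    size_t  nb[4];       // stride in bytes per dimension
};

// Byte offset of element (i0, i1, i2, i3). Valid for any strides, so it
// works for permuted or sliced (non-contiguous) views as well.
static size_t element_offset(const struct tensor_view *t,
                             int64_t i0, int64_t i1, int64_t i2, int64_t i3) {
    return (size_t)i0 * t->nb[0]
         + (size_t)i1 * t->nb[1]
         + (size_t)i2 * t->nb[2]
         + (size_t)i3 * t->nb[3];
}

// Contiguity check in the spirit of ggml_is_contiguous(): the packed
// row-major layout has nb[0] equal to the element size and each further
// stride equal to the previous stride times the previous extent. Kernels
// may use this to select a fast path, but the strided path above is what
// stays correct for every input.
static int is_contiguous(const struct tensor_view *t, size_t type_size) {
    return t->nb[0] == type_size
        && t->nb[1] == t->nb[0] * (size_t)t->ne[0]
        && t->nb[2] == t->nb[1] * (size_t)t->ne[1]
        && t->nb[3] == t->nb[2] * (size_t)t->ne[2];
}
```

Backends generally make these strides available to their kernels (the Vulkan backend passes tensor shape and stride parameters to its shaders), so a fix for a non-contiguous case typically amounts to consuming the real strides instead of assuming the packed layout.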
