llama.cpp
vulkan: fix coopmat2 flash attention for non-contiguous inputs
#11281
Merged