llama.cpp
vulkan: Implement split_k for coopmat2 flash attention.
#12627
Merged


jeffbolznv requested a review from 0cc4m 266 days ago
github-actions added labels: testing, Vulkan, ggml
0cc4m approved these changes on 2025-04-02
jeffbolznv force pushed to cc644fb6 (vulkan: Implement split_k for coopmat2 flash attention.) 260 days ago
jeffbolznv merged f01bd023 into master 260 days ago
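The PR itself carries no written description of the algorithm, but the split-k technique it names is well known from flash-attention kernels: the KV sequence is partitioned across workgroups, each workgroup computes an unnormalized partial output together with its running softmax row-max and row-sum, and a final reduction pass rescales and combines the partials. The NumPy sketch below illustrates that general reduction only; it is not the Vulkan coopmat2 shader from this PR, and all function names here are hypothetical.

```python
import numpy as np

def attention_ref(q, k, v):
    # Plain single-pass scaled dot-product attention, for comparison.
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def attention_split_k(q, k, v, splits=4):
    # Each split processes a slice of the KV sequence and keeps its
    # unnormalized partial output plus the softmax row-max and row-sum.
    d = q.shape[-1]
    parts = []
    for ks, vs in zip(np.array_split(k, splits), np.array_split(v, splits)):
        s = q @ ks.T / np.sqrt(d)
        m = s.max(axis=-1, keepdims=True)   # per-split row max
        p = np.exp(s - m)
        l = p.sum(axis=-1, keepdims=True)   # per-split row sum
        parts.append((m, l, p @ vs))        # unnormalized partial output
    # Reduction pass: rescale every partial to the global row max,
    # then normalize by the combined softmax denominator.
    m_all = np.max(np.concatenate([m for m, _, _ in parts], axis=-1),
                   axis=-1, keepdims=True)
    l_tot = sum(np.exp(m - m_all) * l for m, l, _ in parts)
    o_tot = sum(np.exp(m - m_all) * o for m, _, o in parts)
    return o_tot / l_tot
```

The rescaling by `exp(m - m_all)` is what makes the per-split softmaxes composable: after it, the partial numerators and denominators sum to exactly the values a single-pass softmax would have produced.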
