llama.cpp
OpenCL: add attention sinks support for FA kernels
#15706
Merged
lhez merged 1 commit into ggml-org:master from rmatif:add-attn-sinks-fa
Commit b27e95a7: add attn sinks support for FA kernels
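For context, an attention sink is a per-head logit that participates in the softmax normalization but contributes no value vector, so it only dampens the attention weights. Below is a minimal CPU-side sketch of how such a sink can be folded into a flash-attention style online softmax; it is illustrative only, does not reproduce this PR's OpenCL kernels, and the names (`attn_row_with_sink`, `n_kv`, `head_dim`, `sink`) are assumptions for the example.

```c
#include <math.h>
#include <stddef.h>

/* Sketch: one query row of streaming (flash-attention style) softmax
 * with an attention sink. The sink acts like an extra key whose value
 * vector is zero: it enters the denominator but not the output sum. */
void attn_row_with_sink(const float *logits, /* q.kT * scale, length n_kv */
                        const float *V,      /* n_kv x head_dim values    */
                        float        sink,   /* per-head sink logit       */
                        float       *out,    /* head_dim output/accum     */
                        size_t n_kv, size_t head_dim) {
    float M = -INFINITY; /* running max of the logits seen so far */
    float S = 0.0f;      /* running softmax denominator           */
    for (size_t d = 0; d < head_dim; ++d) out[d] = 0.0f;

    for (size_t i = 0; i < n_kv; ++i) {
        float m_new = logits[i] > M ? logits[i] : M;
        float scale = expf(M - m_new);        /* rescale earlier partials */
        float p     = expf(logits[i] - m_new);
        S = S * scale + p;
        for (size_t d = 0; d < head_dim; ++d)
            out[d] = out[d] * scale + p * V[i * head_dim + d];
        M = m_new;
    }

    /* Fold in the sink at the end: denominator only, no value. */
    float m_new = sink > M ? sink : M;
    float scale = expf(M - m_new);
    S = S * scale + expf(sink - m_new);
    for (size_t d = 0; d < head_dim; ++d)
        out[d] = out[d] * scale / S;
}
```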
rmatif requested a review from max-krasnyansky 68 days ago
rmatif requested a review from lhez 68 days ago
github-actions added the ggml label
github-actions added the OpenCL label
lhez approved these changes on 2025-09-02
lhez merged 97669e40 into master 66 days ago
Reviewers: lhez, max-krasnyansky
Assignees: No one assigned
Labels: ggml, OpenCL
Milestone: No milestone