llama.cpp
[SYCL] Enhance flash-attention performance
#21185
Merged
ggerganov merged 1 commit into ggml-org:master from arthw:en_fattn
Commit: enhance fattn perf (50884283)
arthw requested a review 6 days ago
github-actions added the ggml label
github-actions added the SYCL label
ggerganov merged 62278ced into master 5 days ago
Reviewers: no reviews
Assignees: no one assigned
Labels: ggml, SYCL
Milestone: no milestone