llama.cpp

[SYCL] Enhance flash-attention performance #21185 (Merged)

ggerganov merged 1 commit into ggml-org:master from arthw:en_fattn
arthw committed "enhance fattn perf" (50884283) and requested a review 6 days ago
github-actions added the ggml and SYCL labels
ggerganov merged 62278ced into master 5 days ago
Participants: encodatamHirmer, NeoZhangJianyu
