llama.cpp
[SYCL] Enhance flash-attention performance (#21185, merged)
Commits
enhance fattn perf (arthw, committed 16 days ago)