llama.cpp
[SYCL] Enhance flash-attention performance
#21185
Merged

Commits
  • enhance fattn perf
    arthw committed 16 days ago