llama.cpp PR #16969 (Open)
sycl: flash-attention implementation