llama.cpp
sycl: add F16 support for GGML_OP_CEIL
#19306
Merged
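
The PR title only names the change; as a hedged illustration of the technique, the sketch below shows roughly what an element-wise F16 ceil looks like in SYCL. The queue/buffer setup and the `sycl::ceil` call on `sycl::half` are generic SYCL usage for this kind of op, not the actual ggml-sycl code added by this PR.

```cpp
// Hypothetical sketch (not the PR diff): element-wise ceil over sycl::half,
// roughly the shape an F16 GGML_OP_CEIL kernel in a SYCL backend could take.
#include <sycl/sycl.hpp>
#include <cstdio>
#include <vector>

int main() {
    sycl::queue q;                                  // default device (needs fp16 support)
    const size_t n = 8;
    std::vector<sycl::half> data(n);
    for (size_t i = 0; i < n; ++i) data[i] = sycl::half(0.25f * (float) i - 1.0f);

    {
        sycl::buffer<sycl::half, 1> buf(data.data(), sycl::range<1>(n));
        q.submit([&](sycl::handler & cgh) {
            auto acc = buf.get_access<sycl::access::mode::read_write>(cgh);
            cgh.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                // sycl::ceil is defined for half directly, so no float round-trip is needed
                acc[i] = sycl::ceil(acc[i]);
            });
        });
    } // buffer destruction copies results back to the host vector

    for (size_t i = 0; i < n; ++i) printf("ceil -> %g\n", (float) data[i]);
    return 0;
}
```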
