llama.cpp
ec68e84c - ggml : support bcast ggml_soft_max_ext, ggml_flash_attn_ext (#14435)

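The commit title says ggml gains broadcast ("bcast") support in ggml_soft_max_ext and ggml_flash_attn_ext, i.e. a mask tensor with singleton head/batch dimensions can be applied to a larger scores tensor instead of having to be pre-expanded. The following is a minimal sketch, not taken from the commit: it assumes the ggml_soft_max_ext(ctx, a, mask, scale, max_bias) signature, the dimension sizes are illustrative only, and graph execution on a backend is omitted for brevity.

    // Sketch: masked softmax where one mask is broadcast across all heads.
    #include "ggml.h"

    int main(void) {
        struct ggml_init_params params = {
            /*.mem_size   =*/ 64*1024*1024,
            /*.mem_buffer =*/ NULL,
            /*.no_alloc   =*/ false,
        };
        struct ggml_context * ctx = ggml_init(params);

        // illustrative sizes, not from the commit
        const int n_kv = 256, n_tokens = 32, n_head = 8, n_batch = 1;

        // attention scores: one [n_kv, n_tokens] slice per head and batch
        struct ggml_tensor * scores = ggml_new_tensor_4d(ctx, GGML_TYPE_F32, n_kv, n_tokens, n_head, n_batch);

        // a single mask shared by all heads (ne[2] == ne[3] == 1), broadcast by the op
        struct ggml_tensor * mask = ggml_new_tensor_4d(ctx, GGML_TYPE_F32, n_kv, n_tokens, 1, 1);

        // scaled, masked softmax; scale = 1/sqrt(head_dim) (illustrative), no ALiBi bias
        struct ggml_tensor * probs = ggml_soft_max_ext(ctx, scores, mask, 0.125f, 0.0f);

        // build the compute graph; actual execution on a backend is left out here
        struct ggml_cgraph * gf = ggml_new_graph(ctx);
        ggml_build_forward_expand(gf, probs);

        ggml_free(ctx);
        return 0;
    }

The same broadcasting idea applies to the mask argument of ggml_flash_attn_ext: one causal mask can serve every attention head rather than being duplicated per head.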