llama.cpp
166e60bf - ggml : ggml_flash_attn_ext() support ALiBi (CPU)

Committed 1 year ago.