llama.cpp · commit 166e60bf
Committed 1 year ago
ggml : ggml_flash_attn_ext() support ALiBi (CPU)
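
This commit extends ggml's fused flash-attention operator, ggml_flash_attn_ext(), so that the CPU path can apply ALiBi (Attention with Linear Biases). ALiBi replaces positional embeddings with a per-head linear penalty subtracted from the raw attention scores before the softmax. The sketch below illustrates that bias computation under the classic slope schedule from the ALiBi paper; the helper name alibi_slope and the constants are illustrative only, not the ggml API, and ggml's actual slopes (derived from the operator's max_bias parameter) may follow a different schedule.

/*
 * Minimal sketch of the ALiBi bias, assuming the classic schedule:
 *
 *     score(i, j) = (q_i . k_j) * scale  -  m_h * (i - j)
 *
 * where m_h is a head-specific slope, m_h = 2^(-8(h+1)/H) for head
 * h of H heads (H a power of two), and i >= j under a causal mask.
 */
#include <math.h>
#include <stdio.h>

/* hypothetical helper: slope for head h out of n_head heads */
static float alibi_slope(int h, int n_head) {
    /* geometric sequence from the ALiBi paper: 2^(-8/H), 2^(-16/H), ... */
    return powf(2.0f, -8.0f * (float)(h + 1) / (float)n_head);
}

int main(void) {
    enum { N_HEAD = 8, N_CTX = 4 };

    for (int h = 0; h < N_HEAD; ++h) {
        const float m = alibi_slope(h, N_HEAD);
        printf("head %d: slope = %.6f, biases for query pos %d:", h, m, N_CTX - 1);
        /* bias added to the last query's score against each key position j */
        for (int j = 0; j < N_CTX; ++j) {
            const int dist = (N_CTX - 1) - j;   /* distance from query to key j */
            printf(" %+.4f", -m * (float)dist); /* nearer keys are penalized less */
        }
        printf("\n");
    }
    return 0;
}

Because the penalty is linear in distance and applied per score, it folds naturally into a fused flash-attention kernel: each tile of scores can add its bias on the fly, with no separate bias tensor materialized.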
References
#7192 - ggml : full ALiBi support
Author: ggerganov
Committer: ggerganov
Parent: d0592d49