llama.cpp
fa7ebcca
- ggml : fix GQA support in ggml_flash_attn_ext
ggml : fix GQA support in ggml_flash_attn_ext

Author: ggerganov
Date: 1 year ago
Parents: a1c004ef
References: #5021 - ggml : add Flash Attention
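
For context: grouped-query attention (GQA) uses fewer K/V heads than query heads, so a flash-attention kernel such as ggml_flash_attn_ext has to broadcast each K/V head across its group of query heads instead of indexing heads one-to-one. Below is a minimal C sketch of that head-index mapping only, not the actual fix in this commit; the helper name kv_head_for_q_head and the head counts are illustrative assumptions.

```c
#include <stdio.h>

// GQA head mapping (sketch): with n_head_q query heads and n_head_kv
// K/V heads, where n_head_q % n_head_kv == 0, each K/V head is shared
// by a group of n_head_q / n_head_kv query heads. The function name is
// hypothetical, not a ggml symbol.
static int kv_head_for_q_head(int iq, int n_head_q, int n_head_kv) {
    const int n_group = n_head_q / n_head_kv; // query heads per K/V head
    return iq / n_group;                      // broadcast K/V across the group
}

int main(void) {
    // Illustrative values; any n_head_q divisible by n_head_kv works.
    const int n_head_q  = 32;
    const int n_head_kv = 8;

    for (int iq = 0; iq < n_head_q; ++iq) {
        printf("q head %2d -> kv head %d\n",
               iq, kv_head_for_q_head(iq, n_head_q, n_head_kv));
    }
    return 0;
}
```

With these values, query heads 0..3 map to K/V head 0, heads 4..7 to K/V head 1, and so on; a kernel that assumed a one-to-one head layout (n_head_q == n_head_kv) would read the wrong K/V data whenever a model uses GQA.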