llama.cpp
Commit 910b15bb
ggml : fix ggml_soft_max mask requirement
Committed 2 years ago
References
#5021 - ggml : add Flash Attention
Author: ggerganov
Parent: 8ad92dc1