llama.cpp
1174def5 - metal : fix flash attention kernel requirements
Commit
1 year ago
metal : fix flash attention kernel requirements

ggml-ci
References
#7169 - metal : fix flash attention kernel requirements
Author
ggerganov
Parents
07cd41d0