llama.cpp
afc4a7de
- llama : enable flash attn automatically when supported
1 year ago
References
sl/auto-flash-attn
#10101 - llama : enable flash attn automatically when supported (WIP)
Author: slaren
Committer: slaren
Parents: b9e02e81