llama.cpp
1f8a5924
- cuda : make loops use the same loop values
Commit
2 years ago
cuda : make loops use the same loop values

Thanks Johannes again for the tip
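The diff itself is not shown here, so the following is only a minimal sketch of the general idea the commit title suggests, not the actual change: when a CUDA kernel makes several passes over the same range, writing every pass with identical loop values (same start, bound, and stride) keeps the indexing consistent between passes and easier to verify. The kernel `two_pass_scale` and all names below are illustrative assumptions.

```cuda
// Illustrative sketch only: two passes over the same range use identical
// loop values (start = threadIdx.x, bound = n, stride = blockDim.x).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void two_pass_scale(const float * x, float * y, int n) {
    extern __shared__ float buf[];

    // pass 1: stage the input into shared memory
    for (int i = threadIdx.x; i < n; i += blockDim.x) {
        buf[i] = x[i];
    }
    __syncthreads();

    // pass 2: same loop values as pass 1, so both passes cover
    // exactly the same indices
    for (int i = threadIdx.x; i < n; i += blockDim.x) {
        y[i] = 2.0f*buf[i];
    }
}

int main() {
    const int n = 256;
    float hx[n], hy[n];
    for (int i = 0; i < n; ++i) hx[i] = (float) i;

    float *dx, *dy;
    cudaMalloc(&dx, n*sizeof(float));
    cudaMalloc(&dy, n*sizeof(float));
    cudaMemcpy(dx, hx, n*sizeof(float), cudaMemcpyHostToDevice);

    two_pass_scale<<<1, 128, n*sizeof(float)>>>(dx, dy, n);
    cudaMemcpy(hy, dy, n*sizeof(float), cudaMemcpyDeviceToHost);

    printf("y[10] = %.1f\n", hy[10]); // expect 20.0

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```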
References
#5021 - ggml : add Flash Attention
Author
ggerganov
Parents
7c34655b