llama.cpp
8137b4bb - CPU/CUDA: fix (GQA) mul mat back, add CUDA support (#11380)
Commit
340 days ago
CPU/CUDA: fix (GQA) mul mat back, add CUDA support (#11380)
References
#11380 - CPU/CUDA: fix GQA mul mat back, add CUDA support
Author
JohannesGaessler
Parents
1af6945e
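The commit title refers to the backward pass of a matrix multiplication in which one operand is broadcast across attention-head groups, as happens with grouped-query attention (GQA): several query heads share one KV head, so the gradient for the shared operand has to be accumulated over the whole group. The following is a minimal illustrative sketch of that reduction; the shapes, names, and loop structure are assumptions for the example, not ggml's actual implementation.

```cpp
// Minimal sketch (illustrative only, not ggml's implementation) of why the
// backward pass of a GQA-style broadcast matmul needs a reduction: several
// query heads share one KV head, so dK must accumulate over the whole group.
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical sizes: 4 query heads sharing 2 KV heads -> group size 2.
    const int n_head = 4, n_kv_head = 2, group = n_head / n_kv_head;
    const int d = 3, n_tok = 2; // head dimension, number of tokens

    // Forward (per query head h): S_h = K_{h/group}^T Q_h, with K broadcast.
    // Upstream gradient dS has one slice per query head: [n_head][n_tok][n_tok].
    std::vector<float> Q (n_head * d * n_tok, 0.5f);
    std::vector<float> dS(n_head * n_tok * n_tok, 1.0f);

    // dK = Q dS^T per head, summed over the query heads of each group.
    // Overwriting instead of accumulating here is exactly the kind of bug a
    // broadcasted mul-mat backward has to avoid.
    std::vector<float> dK(n_kv_head * d * n_tok, 0.0f);

    for (int h = 0; h < n_head; ++h) {
        const int kvh = h / group;               // KV head shared by this query head
        for (int i = 0; i < d; ++i) {
            for (int t = 0; t < n_tok; ++t) {
                float acc = 0.0f;
                for (int s = 0; s < n_tok; ++s) { // dK[i][t] += sum_s Q[i][s] * dS[t][s]
                    acc += Q[(h*d + i)*n_tok + s] * dS[(h*n_tok + t)*n_tok + s];
                }
                dK[(kvh*d + i)*n_tok + t] += acc; // accumulate across the group
            }
        }
    }

    printf("dK[0][0][0] = %.1f (sum over %d query heads)\n", dK[0], group);
    return 0;
}
```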