llama.cpp
4ebd0c12 - cuda : fix GGML_CUDA_GRAPHS=OFF (#15300)
Commit
28 days ago
cuda : fix GGML_CUDA_GRAPHS=OFF (#15300)

* fix USE_CUDA_GRAPH=OFF

  ggml-ci

* check capture status

* completely disable capturing check instead
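The commit message is terse, so here is a minimal sketch (not the actual patch) of the kind of guard it describes: when the build sets GGML_CUDA_GRAPHS=OFF, the USE_CUDA_GRAPH define is absent and the stream-capture status check is compiled out entirely rather than evaluated at runtime. The helper name stream_is_capturing is hypothetical; only cudaStreamIsCapturing and USE_CUDA_GRAPH come from the CUDA runtime and the commit text.

    // Minimal sketch, assuming the capture check is guarded by USE_CUDA_GRAPH
    // (defined when building with GGML_CUDA_GRAPHS=ON). Helper name is hypothetical.
    #include <cuda_runtime.h>

    static bool stream_is_capturing(cudaStream_t stream) {
    #ifdef USE_CUDA_GRAPH
        cudaStreamCaptureStatus status = cudaStreamCaptureStatusNone;
        // cudaStreamIsCapturing reports whether work submitted to this stream
        // is currently being recorded into a CUDA graph.
        if (cudaStreamIsCapturing(stream, &status) != cudaSuccess) {
            return false;
        }
        return status == cudaStreamCaptureStatusActive;
    #else
        // With CUDA graphs disabled at build time the check is compiled out,
        // so graph-capture state is never queried.
        (void) stream;
        return false;
    #endif
    }

Compiling the check out entirely, instead of short-circuiting it at runtime, matches the final bullet of the commit message ("completely disable capturing check instead").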
References
#15300 - cuda : fix GGML_CUDA_GRAPHS=OFF
Author
CISC
Parents
5cdb27e0