llama.cpp
4ebd0c12 - cuda : fix GGML_CUDA_GRAPHS=OFF (#15300)

cuda : fix GGML_CUDA_GRAPHS=OFF (#15300)

* fix USE_CUDA_GRAPH=OFF (ggml-ci)
* check capture status
* completely disable capturing check instead
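The fix concerns building the CUDA backend with graph support turned off. Below is a minimal sketch, not the llama.cpp implementation, of the general pattern the commit message points at: CUDA graph capture is guarded by a preprocessor define (assumed here to be `USE_CUDA_GRAPH`, as the commit body suggests the `GGML_CUDA_GRAPHS` build option maps to it), the stream's capture status is checked before the captured graph is used, and a plain stream launch serves as the fallback when graphs are compiled out or capture is unavailable. The kernel and wrapper names are illustrative only.

```cuda
#include <cuda_runtime.h>

__global__ void scale_kernel(float * x, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        x[i] *= s;
    }
}

// Launch the work either through a captured CUDA graph (when the build
// defines USE_CUDA_GRAPH) or as a plain stream launch (when graphs are
// compiled out, i.e. the equivalent of GGML_CUDA_GRAPHS=OFF).
static void launch_scale(float * d_x, float s, int n, cudaStream_t stream) {
    const int block = 256;
    const int grid  = (n + block - 1) / block;

#ifdef USE_CUDA_GRAPH
    cudaGraph_t     graph      = nullptr;
    cudaGraphExec_t graph_exec = nullptr;

    cudaStreamBeginCapture(stream, cudaStreamCaptureModeThreadLocal);
    scale_kernel<<<grid, block, 0, stream>>>(d_x, s, n);

    // "check capture status": confirm the stream is still capturing before
    // relying on the resulting graph; otherwise fall back to a plain launch.
    cudaStreamCaptureStatus status = cudaStreamCaptureStatusNone;
    cudaStreamIsCapturing(stream, &status);

    // End the capture in any case so the stream returns to normal mode.
    cudaError_t err = cudaStreamEndCapture(stream, &graph);

    if (status == cudaStreamCaptureStatusActive && err == cudaSuccess) {
        cudaGraphInstantiate(&graph_exec, graph, 0); // CUDA 12-style signature
        cudaGraphLaunch(graph_exec, stream);
        cudaGraphExecDestroy(graph_exec);
        cudaGraphDestroy(graph);
        return;
    }
    if (graph != nullptr) {
        cudaGraphDestroy(graph);
    }
#endif
    // Graphs disabled at build time (or capture failed): plain launch.
    scale_kernel<<<grid, block, 0, stream>>>(d_x, s, n);
}
```

With this layout, a build configured without graph support never compiles the capture path at all, so no capture-status check can misfire there; that is consistent with the commit's final step of disabling the capturing check entirely rather than keeping it in the graphs-off configuration.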