llama.cpp
27ebfcac - llama : do not crash if there is no CPU backend (#13395)

Commit message (183 days ago):

* llama : do not crash if there is no CPU backend
* add checks to examples
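The change described above replaces a hard crash with an error path when no CPU backend is available (for example in builds where backends are loaded as dynamic modules and the CPU module is missing), and adds similar checks to the examples. Below is a minimal sketch of that kind of guard, assuming the ggml device-registry call ggml_backend_dev_by_type and the llama.h loading API (llama_model_load_from_file, llama_model_free); the exact checks added by the commit may differ in detail.

    // Sketch only: fail with an error instead of crashing when the CPU
    // backend is unavailable. API names are assumptions based on the
    // current ggml/llama.h headers, not a copy of the commit's code.
    #include <cstdio>

    #include "ggml-backend.h"
    #include "llama.h"

    int main(int argc, char ** argv) {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <model.gguf>\n", argv[0]);
            return 1;
        }

        // Guard: with dynamically loaded backends the CPU backend may be
        // absent; report the problem instead of hitting a null pointer later.
        if (ggml_backend_dev_by_type(GGML_BACKEND_DEVICE_TYPE_CPU) == nullptr) {
            fprintf(stderr, "error: no CPU backend found, cannot load model\n");
            return 1;
        }

        llama_model_params mparams = llama_model_default_params();

        // Example-side check: treat a failed load as an error, not a crash.
        llama_model * model = llama_model_load_from_file(argv[1], mparams);
        if (model == nullptr) {
            fprintf(stderr, "error: failed to load model '%s'\n", argv[1]);
            return 1;
        }

        llama_model_free(model);
        return 0;
    }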