llama.cpp
27ebfcac
- llama : do not crash if there is no CPU backend (#13395)
Commit
31 days ago

llama : do not crash if there is no CPU backend (#13395)

* llama : do not crash if there is no CPU backend
* add checks to examples
References
#13395 - llama : do not crash if there is no CPU backend
Author
slaren
Parents
5c86c9ed
Files (7)

src/llama-adapter.cpp
src/llama-model-loader.cpp
src/llama-model.cpp
tools/main/main.cpp
tools/mtmd/clip.cpp
tools/mtmd/llava.cpp
tools/rpc/rpc-server.cpp