llama.cpp
27ebfcac - llama : do not crash if there is no CPU backend (#13395)

Commit · 31 days ago
llama : do not crash if there is no CPU backend (#13395)

* llama : do not crash if there is no CPU backend
* add checks to examples
Files changed:
  • src/llama-adapter.cpp
  • src/llama-model-loader.cpp
  • src/llama-model.cpp
  • tools/main/main.cpp
  • tools/mtmd/clip.cpp
  • tools/mtmd/llava.cpp
  • tools/rpc/rpc-server.cpp