llama.cpp
978ba3d8 - Server: Don't ignore llama.cpp params (#8754)

Commit · 321 days ago
Server: Don't ignore llama.cpp params (#8754)
  • Don't ignore llama.cpp params
  • Add fallback for max_tokens
Files changed:
  • examples/server/server.cpp
  • examples/server/utils.hpp