llama.cpp
978ba3d8 - Server: Don't ignore llama.cpp params (#8754)

Server: Don't ignore llama.cpp params (#8754)

* Don't ignore llama.cpp params
* Add fallback for max_tokens
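The second bullet describes a fallback for the generation-length parameter: when an OpenAI-style request supplies `max_tokens`, use it; otherwise fall back to llama.cpp's native `n_predict`. Below is a minimal sketch of that pattern, assuming an nlohmann::json request body; the helper name `resolve_max_tokens` is hypothetical and this is not the actual server code from the commit.

```cpp
// Hypothetical sketch of a max_tokens -> n_predict fallback,
// assuming the request body is parsed with nlohmann::json.
#include <cstdint>
#include <iostream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

// Prefer the OpenAI-compatible "max_tokens" field when the client
// supplies it; otherwise fall back to the native "n_predict" field,
// then to -1 (llama.cpp's convention for "no explicit limit").
static int32_t resolve_max_tokens(const json & body) {
    if (body.contains("max_tokens") && body["max_tokens"].is_number_integer()) {
        return body["max_tokens"].get<int32_t>();
    }
    if (body.contains("n_predict") && body["n_predict"].is_number_integer()) {
        return body["n_predict"].get<int32_t>();
    }
    return -1; // no limit requested
}

int main() {
    // A request that only uses the native llama.cpp parameter name.
    json body = { {"prompt", "Hello"}, {"n_predict", 128} };
    std::cout << resolve_max_tokens(body) << "\n"; // prints 128
}
```

The point of the fallback is compatibility in both directions: OpenAI clients that send `max_tokens` keep working, while existing llama.cpp clients that send `n_predict` are no longer silently ignored.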