llama.cpp
Commit 2c9fe91e: move the chat template print to init()
Commit
35 days ago
move the chat template print to init()
References
#17216 - server: split HTTP into its own interface
#36 - PoC proxy with stream support
#37 - PoC server with fully functional router, model load/unload (multiple models in parallel)
Author
ngxson
Parents
68d5c6f8