llama.cpp
55ccf46b
Commit
130 days ago
bring back the "server is listening on" message
References
#17216 - server: split HTTP into its own interface
#36 - PoC proxy with stream support
#37 - PoC server with fully functional router, model load/unload (multiple models in parallel)
Author
ngxson
Parents
1bc41f60