llama.cpp
Commit 473b0e58: add the remaining endpoints
Committed 113 days ago
References
#17216 - server: split HTTP into its own interface
#35 - PoC llama-cli using server code
#36 - PoC proxy with stream support
#37 - PoC server with fully functional router, model load/unload (multiple models in parallel)
Author: ngxson
Parent: fe98058f