llama.cpp
45b2fe19 - server: split HTTP into its own interface
Commit
41 days ago
server: split HTTP into its own interface
References
#17216 - server: split HTTP into its own interface
#35 - PoC llama-cli using server code
#36 - PoC proxy with stream support
#37 - PoC server with fully functional router, model load/unload (multiple models in parallel)
Author
ngxson
Committer
ngxson
Parents
00c94083
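
The commit title and the linked PoCs (a llama-cli reusing server code, a streaming proxy, a router with model load/unload) describe an architectural split: the server core is decoupled from the HTTP layer so that different front ends can drive the same logic. The sketch below is a hypothetical illustration of that pattern only, not llama.cpp's actual API; the names server_interface, server_task, server_response, and dummy_server are invented for this example.

// Hypothetical sketch: a transport-agnostic server core that an HTTP layer,
// a CLI, or a proxy could all drive. These types do not exist in llama.cpp.
#include <algorithm>
#include <functional>
#include <iostream>
#include <string>
#include <vector>

// What the caller wants, with no HTTP details attached.
struct server_task {
    std::string model;   // which loaded model should handle the task
    std::string prompt;  // input text
};

// A streamed result chunk, delivered via callback so any front end
// (SSE over HTTP, CLI stdout, proxy forwarding) can relay it as it arrives.
struct server_response {
    std::string text;
    bool        is_final = false;
};

// The core interface: front ends depend on this, never on an HTTP server.
class server_interface {
public:
    virtual ~server_interface() = default;
    virtual bool load_model  (const std::string & name) = 0;
    virtual void unload_model(const std::string & name) = 0;
    virtual void submit(const server_task & task,
                        const std::function<void(const server_response &)> & on_chunk) = 0;
};

// A toy backend standing in for real inference, to show the shape of the split.
class dummy_server : public server_interface {
public:
    bool load_model(const std::string & name) override {
        loaded.push_back(name);
        return true;
    }
    void unload_model(const std::string & name) override {
        loaded.erase(std::remove(loaded.begin(), loaded.end(), name), loaded.end());
    }
    void submit(const server_task & task,
                const std::function<void(const server_response &)> & on_chunk) override {
        // Pretend to stream two chunks for the prompt.
        on_chunk({ "echo(" + task.model + "): ", false });
        on_chunk({ task.prompt, true });
    }
private:
    std::vector<std::string> loaded;  // models currently loaded in parallel
};

// A CLI-style front end driving the same core an HTTP front end would use.
int main() {
    dummy_server srv;
    srv.load_model("model-a");
    srv.submit({ "model-a", "hello" }, [](const server_response & r) {
        std::cout << r.text;
        if (r.is_final) std::cout << "\n";
    });
    srv.unload_model("model-a");
}

With a split of this kind, HTTP becomes just one caller of the core interface, which is what would make the referenced PoCs (CLI reuse of server code, a streaming proxy, a multi-model router) feasible on top of the same code path.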