llama.cpp commit 9917e044: add NOMINMAX
Commit
39 days ago
add NOMINMAX
References
#17216 - server: split HTTP into its own interface
#36 - PoC proxy with stream support
#37 - PoC server with fully functional router, model load/unload (multiple models in parallel)
Author
ngxson
Parents
3be8a3ac