llama.cpp
7a389564
- allow server to multithread
Committed 2 years ago
allow server to multithread

Because web browsers send a lot of garbage requests, we want the server to multithread when serving 404s for favicons etc. To avoid blowing up llama, we just take a mutex whenever llama is invoked.
References
#1998 - Simple webchat for server
Author
tobi
Committer
tobi
Parents
a30d4b2a