llama.cpp PR #7264 (merged): server: free sampling contexts on exit
