llama.cpp
sampling: make top_n_sigma no-op at <=0 rather than <0
#13345
Merged