llama.cpp
server: remove the verbose_prompt parameter
#21059
Merged

aisk server: respect the verbose_prompt parameter
8ed885cf
aisk requested a review 6 days ago
ServeurpersoCom approved these changes on 2026-03-27
github-actions added the examples and server labels
CISC closed this 6 days ago
aisk deleted the server-verbose-prompt branch 6 days ago
aisk Revert "server: respect the verbose_prompt parameter"
9ff0157e
aisk Remove --verbose-prompt parameter from llama-server
e2beda86
aisk changed the title from "server: respect the verbose_prompt parameter" to "server: remove the verbose_prompt parameter" 6 days ago
CISC reopened this 6 days ago
CISC requested a review 6 days ago
ngxson commented on 2026-03-27
aisk Using set_examples instead of set_excludes
530e1729
ngxson approved these changes on 2026-03-27
ggerganov approved these changes on 2026-03-27
ggerganov merged 48cda24c into master 6 days ago
