llama.cpp
server, webui: accept continue_final_message flag for vLLM API compat
#23012
Merged

ServeurpersoCom requested a review 3 days ago
allozaur approved these changes on 2026-05-13
ServeurpersoCom committed server, webui: accept continue_final_message flag for vLLM API compat (972f4a72)
ServeurpersoCom force-pushed from 918af577 to 972f4a72 3 days ago
ngxson approved these changes on 2026-05-13
ServeurpersoCom committed test: add coverage for continue_final_message vLLM compat flag (e4ba4106)
ServeurpersoCom committed chore: update webui build output (310140ec)
aldehir approved these changes on 2026-05-13
github-actions added labels: server/webui, examples, python, server
ServeurpersoCom merged 95d469a9 into master 2 days ago
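For context on what this PR enables: `continue_final_message` comes from vLLM's OpenAI-compatible chat completions API, where it asks the server to continue the trailing assistant message instead of starting a new assistant turn. Below is a minimal sketch of such a request body; the model name is a placeholder, and the exact behavior of the flag in llama.cpp's server should be checked against this PR's diff and tests.

```python
import json

# Illustrative vLLM-style chat completions payload (field names follow
# vLLM's API; the model value is a placeholder, not from this PR).
payload = {
    "model": "llama",
    "messages": [
        {"role": "user", "content": "Write a haiku about the sea."},
        # A partial assistant turn that the model should extend,
        # not answer as a fresh message.
        {"role": "assistant", "content": "Waves fold into foam,"},
    ],
    # Ask the server to continue the final assistant message in place.
    "continue_final_message": True,
    # vLLM pairs this with add_generation_prompt=False so the chat
    # template does not open a new assistant turn.
    "add_generation_prompt": False,
}

# Serialized body as it would be POSTed to /v1/chat/completions.
print(json.dumps(payload, indent=2))
```

The point of the flag is prefix-constrained generation: the final assistant message acts as a forced prefix of the completion, which is useful for guided or resumed responses.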