llama.cpp
server, webui: accept continue_final_message flag for vLLM API compat (#23012)
Merged
ServeurpersoCom merged 3 commits into ggml-org:master from ServeurpersoCom:reasoning-continue-prefill-vllm-compat
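For context: in vLLM's OpenAI-compatible chat API, `continue_final_message` (there paired with `add_generation_prompt: false`) tells the server to continue generating from a trailing assistant message instead of opening a fresh assistant turn, which is useful for reasoning/prefill workflows. This PR makes llama-server and the webui accept the same flag. Below is a minimal sketch of a request exercising it, assuming a llama-server listening on localhost:8080; the pairing with `add_generation_prompt` is carried over from vLLM's API and is an assumption, not confirmed against this PR's diff.

```python
import json
import urllib.request

# Assumed local llama-server endpoint; adjust host/port as needed.
URL = "http://127.0.0.1:8080/v1/chat/completions"

payload = {
    "messages": [
        {"role": "user", "content": "Write a haiku about autumn."},
        # Partial assistant message the model should continue from.
        {"role": "assistant", "content": "Crisp leaves drift"},
    ],
    # vLLM-compatible flag accepted by this PR: continue the final
    # assistant message rather than starting a new assistant turn.
    "continue_final_message": True,
    # vLLM requires this pairing; whether llama-server does is an assumption.
    "add_generation_prompt": False,
    "max_tokens": 64,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The completion should pick up where "Crisp leaves drift" left off.
print(body["choices"][0]["message"]["content"])
```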
ServeurpersoCom requested a review 3 days ago
allozaur approved these changes on 2026-05-13
server, webui: accept continue_final_message flag for vLLM API compat (972f4a72)
ServeurpersoCom force-pushed from 918af577 to 972f4a72 3 days ago
ngxson approved these changes on 2026-05-13
test: add coverage for continue_final_message vLLM compat flag (e4ba4106)
chore: update webui build output (310140ec)
aldehir approved these changes on 2026-05-13
github-actions added the server/webui, examples, python, and server labels
ServeurpersoCom merged commit 95d469a9 into master 2 days ago
Reviewers: aldehir, ngxson, allozaur
Assignees: no one assigned
Labels: server/webui, examples, python, server
Milestone: no milestone