llama.cpp
[Mirror] server: /v1/responses (partial)
#85
Open

ngxson wants to merge 25 commits into ngxson:master from openingnow:v1_responses
47134fc1  from previous PR
c41a6d7d  Make instruction(system) as first message
aa2238ea  Convert [input_message] (text/image/file)
fd0a13bb  Rename convert_responses_to_chatcmpl(body) -> response_body
f4a87c01  Initial tool call support
6e47dea6  Erase instructions field from chatcmpl body
313ea1e8  Feed reasoning texts to chat template
7d7058bb  Use std::vector instead of opaque json array
e550290d  Make output_item.added events consistent
97e649e8  Move `server_task_result_cmpl_partial::update` from header to source
d9dca029  Match ID of output_item.added and .done events
cd9b4cfa  Add function_call only if there is no "fc_" prefix
6c200df3  Add function call output at non-streaming API
63c60135  Test if ID is persistent
f232a1b9  Add doc
8a2dd2d5  Fix style - use trailing comma
42a6eb38  Rewrite state management
5e1f65c0  catch up with upstream/master
951fe420  Fix style - "type" is the first item of SSE data
ebb64386  Explicitly check "instructions" from response_body
cf83e1ab  Make lambdas static
0d5e3dee  Check if reasoning content exists
5ac23d2f  Add `oai_resp_id` to task_result_state(also initialized at ctor), ser…
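Several of the commits above describe how a `/v1/responses` request body is rewritten into a chat-completions body: `instructions` becomes a leading system message, `input` items are converted, and the `instructions` field is erased. A minimal sketch of that mapping, in Python for illustration (the function name mirrors the renamed `convert_responses_to_chatcmpl`, but the helper and its exact field handling here are assumptions, not the PR's C++ code):

```python
def responses_to_chatcmpl(response_body: dict) -> dict:
    """Hypothetical sketch: map a /v1/responses body onto a chat-completions body."""
    messages = []
    # "Make instruction(system) as first message":
    # instructions becomes the leading system message
    instructions = response_body.get("instructions")
    if instructions:
        messages.append({"role": "system", "content": instructions})
    # "Convert [input_message]": input may be a plain string
    # or a list of role/content items
    user_input = response_body.get("input", [])
    if isinstance(user_input, str):
        messages.append({"role": "user", "content": user_input})
    else:
        for item in user_input:
            messages.append({
                "role": item.get("role", "user"),
                "content": item.get("content"),
            })
    # "Erase instructions field from chatcmpl body":
    # drop responses-only fields, keep the rest
    body = {k: v for k, v in response_body.items()
            if k not in ("instructions", "input")}
    body["messages"] = messages
    return body
```

The "Explicitly check `instructions` from response_body" commit suggests the real implementation guards this lookup rather than passing the field through untouched.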
coderabbitai commented on 2026-01-21
github-actions added labels: examples, python, server
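Several commits above concern the streaming shape: `"type"` must be the first item of each SSE data payload, and `output_item.added` / `output_item.done` must carry matching IDs. A minimal sketch of those two invariants (helper names are hypothetical; the actual server emits these events from C++):

```python
import json

def sse_event(payload: dict) -> str:
    # Fix style - "type" is the first item of SSE data:
    # re-insert "type" first so it leads the serialized object
    ordered = {"type": payload["type"]}
    ordered.update((k, v) for k, v in payload.items() if k != "type")
    return "data: " + json.dumps(ordered) + "\n\n"

def output_item_events(item_id: str, item: dict):
    # Match ID of output_item.added and .done events:
    # both events carry the same item id so clients can pair them
    item = {**item, "id": item_id}
    added = sse_event({"type": "response.output_item.added", "item": item})
    done = sse_event({"type": "response.output_item.done", "item": item})
    return added, done
```

In Python, `json.dumps` preserves dict insertion order, which is what makes the "type-first" rewrite hold.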
da3ed763  Reject `input_file` since it is not supported by chatcmpl
96995a64  Add "fc_" prefix to non-streaming function call id as coderabbit point…
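The two follow-up commits handle edge cases flagged in review: function-call IDs get an `"fc_"` prefix only when it is missing (keeping IDs consistent between streaming and non-streaming paths), and `input_file` content is rejected outright because chat completions has no equivalent. A sketch under those assumptions (function names are illustrative):

```python
def normalize_function_call_id(call_id: str) -> str:
    # Add "fc_" prefix only if it is not already present,
    # so streaming and non-streaming IDs agree
    return call_id if call_id.startswith("fc_") else "fc_" + call_id

def check_content_part(part: dict) -> dict:
    # input_file has no chat-completions counterpart, so reject it
    if part.get("type") == "input_file":
        raise ValueError("input_file is not supported by chatcmpl")
    return part
```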
openingnow deleted the v1_responses branch 54 days ago
