llama.cpp
context : fix reserve token padding to n_seqs
#18536
Merged


ggerganov merged 1 commit into master from gg/llama-fix-reserve

ggerganov: context : fix reserve token padding to n_seqs (9f0b151c)
danbev approved these changes on 2026-01-02
ggerganov merged a554a1ec into master 23 days ago
ggerganov deleted the gg/llama-fix-reserve branch 23 days ago
