context : fix reserve token padding to n_seqs #18536
Commit: 9f0b151c
danbev approved these changes on 2026-01-02
ggerganov merged a554a1ec into master 23 days ago
ggerganov deleted the gg/llama-fix-reserve branch 23 days ago
Assignees: no one assigned
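The PR title concerns how many tokens are set aside when reserving the worst-case compute graph for the context. Below is a minimal, hypothetical sketch of the idea only (the variable names, values, and structure are illustrative and not taken from the PR): the reserve token count is padded up to n_seqs so that, when the reserved batch is split across parallel sequences, no sequence ends up with zero tokens.

```cpp
// Hypothetical illustration, not the actual llama.cpp implementation:
// pad the number of reserve tokens to the number of sequences so that
// every sequence in the worst-case reserve batch holds at least one token.
#include <algorithm>
#include <cstdint>
#include <cstdio>

int main() {
    // assumed example values, not taken from the PR
    uint32_t n_tokens = 1; // requested reserve size
    uint32_t n_seqs   = 4; // number of parallel sequences

    // pad so the per-sequence split never produces an empty sequence
    n_tokens = std::max(n_tokens, n_seqs);

    std::printf("reserving graph for %u tokens across %u sequences\n",
                n_tokens, n_seqs);
    return 0;
}
```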