llama.cpp
common: ensure token addition to batch does not exceed llama_batch size
#9668
Merged
matiaslin
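The fix guards the append path so a token is never written past the batch's allocated capacity. A minimal self-contained sketch of that bounds check, using toy stand-in types (`toy_batch`, `toy_batch_add` are illustrative names, not the actual llama.cpp structs or API):

```cpp
#include <cstdint>
#include <vector>

// Illustrative stand-in for llama_batch: a fixed-capacity token buffer.
struct toy_batch {
    std::vector<int32_t> token;
    std::vector<int32_t> pos;
    int32_t n_tokens = 0;
};

// Allocate a batch that can hold at most `capacity` tokens.
toy_batch toy_batch_init(int32_t capacity) {
    toy_batch b;
    b.token.resize(capacity);
    b.pos.resize(capacity);
    return b;
}

// Refuse the append instead of writing past the allocated capacity.
bool toy_batch_add(toy_batch & b, int32_t tok, int32_t pos) {
    if (b.n_tokens >= (int32_t) b.token.size()) {
        return false; // batch full: caller must decode/clear first
    }
    b.token[b.n_tokens] = tok;
    b.pos  [b.n_tokens] = pos;
    b.n_tokens++;
    return true;
}
```

Without such a check, appending to a full batch silently corrupts memory past the arrays allocated at init time; rejecting (or asserting on) the overflow surfaces the bug at the call site instead.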
github-actions added the examples label
ggerganov commented on 2024-09-28
github-actions added the build, testing, Vulkan, python, devops, server, ggml labels
matiaslin pushed commit c197f0fc: common: ensure token addition to batch does not exceed llama_batch size
matiaslin force-pushed to c197f0fc 1 year ago
matiaslin changed the title from "parallel: fix adding tokens to batch" to "common: ensure token addition to batch does not exceed llama_batch size" 1 year ago
ggerganov approved these changes on 2024-09-29
ggerganov added the merge ready label
ggerganov merged faac0bae into master 1 year ago