llama.cpp PR #9668 (Merged)
common: ensure token addition to batch does not exceed llama_batch size
ggerganov merged 1 commit into ggml-org:master from matiaslin:protect_parallel
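The PR title describes the guard: adding a token must not overflow the capacity the llama_batch was allocated with. The following is a minimal standalone sketch of that idea using hypothetical names (`token_batch`, `batch_add`), not the actual llama.cpp API; it models the bounds check with a boolean return so it can be demonstrated without aborting, whereas the upstream helper guards with an assertion.

```cpp
#include <vector>

// Hypothetical stand-in for llama_batch: a fixed-capacity token buffer.
struct token_batch {
    std::vector<int> tokens;   // backing storage, sized once at allocation time
    int n_tokens = 0;          // number of tokens currently in the batch
    explicit token_batch(int capacity) : tokens(capacity) {}
};

// Bounds-checked add: refuse to write past the allocated batch size
// instead of silently corrupting adjacent memory. Returns false when full.
bool batch_add(token_batch & b, int tok) {
    if (b.n_tokens >= (int) b.tokens.size()) {
        return false; // llama_batch size would be exceeded
    }
    b.tokens[b.n_tokens++] = tok;
    return true;
}
```

Callers that loop over prompts (as the parallel example does) can check the return value and flush the batch when it fills, rather than relying on the caller never exceeding the size passed at allocation.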
github-actions added labels: examples, build, testing, Vulkan, python, devops, server, ggml

ggerganov commented on 2024-09-28
Commit c197f0fc: common: ensure token addition to batch does not exceed llama_batch size
matiaslin force-pushed to c197f0fc (1 year ago)
matiaslin changed the title from "parallel: fix adding tokens to batch" to "common: ensure token addition to batch does not exceed llama_batch size" (1 year ago)
ggerganov approved these changes on 2024-09-29
ggerganov added the merge ready label
ggerganov merged faac0bae into master (1 year ago)
Reviewers: ggerganov
Assignees: No one assigned
Labels: build, testing, Vulkan, examples, python, devops, server, ggml, merge ready
Milestone: No milestone