llama.cpp
server : fix crash when using verbose output with input tokens that are not in printable range (#12178)
#12338
Merged
ngxson merged 6 commits into ggml-org:master from ishaangandhi:fix-dos-index
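The branch name (fix-dos-index) and the PR title suggest the crash came from handling token IDs whose detokenized bytes fall outside the printable range while building verbose server output in examples/server/server.cpp. The sketch below is illustrative only, not the merged patch: it shows the general pattern of bounds-checking a token ID before detokenizing it and escaping non-printable bytes before logging. The names token_to_piece and n_vocab are placeholders standing in for the real detokenization helper and vocabulary size.

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Escape non-printable bytes so a detokenized piece can be logged safely.
static std::string escape_for_log(const std::string & piece) {
    std::string out;
    out.reserve(piece.size());
    for (unsigned char c : piece) {
        if (c >= 0x20 && c < 0x7f) {
            out += (char) c;                       // printable ASCII: keep as-is
        } else {
            char buf[8];
            std::snprintf(buf, sizeof(buf), "\\x%02x", (unsigned) c); // escape everything else
            out += buf;
        }
    }
    return out;
}

// Guard the token ID before detokenizing: an out-of-range ID must never be
// used as an index into the vocabulary. token_to_piece is a hypothetical
// detokenization callback supplied by the caller.
static std::string token_to_log_string(int32_t token, int32_t n_vocab,
                                       std::string (*token_to_piece)(int32_t)) {
    if (token < 0 || token >= n_vocab) {
        return "<invalid token " + std::to_string(token) + ">";
    }
    return escape_for_log(token_to_piece(token));
}
```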
ishaangandhi requested a review from ngxson 282 days ago
ngxson commented on 2025-03-11
github-actions added the examples and server labels
ngxson approved these changes on 2025-03-11
ishaangandhi requested a review from JohannesGaessler 281 days ago
github-actions added the testing, Nvidia GPU, Vulkan, python, ggml, SYCL, and Apple Metal labels
Commits (6):
- Fix DOS index bug (d8485ade)
- Remove new APIs (13208979)
- remove extra line (761f4d92)
- Remove from API (2e48a6dc)
- Add extra newline (cc530392)
- Update examples/server/server.cpp (1e14b14e)
ishaangandhi force pushed to 1e14b14e 281 days ago
ngxson changed the title from "bugfix: Prevent DOS when using verbose output with input tokens that are not in printable range (#12178)" to "server : fix crash when using verbose output with input tokens that are not in printable range (#12178)" 281 days ago
ngxson merged 2048b591 into master 281 days ago
ngxson removed the review request from JohannesGaessler 281 days ago
Reviewers: ngxson
Assignees: No one assigned
Labels: testing, Nvidia GPU, Vulkan, examples, python, server, ggml, SYCL, Apple Metal
Milestone: No milestone