llama.cpp
server : Add option to return token pieces in /tokenize endpoint
#9108
Merged
ngxson merged 9 commits into ggml-org:master from mathijshenquet:feature/tokenize-with-pieces
server : added with_pieces functionality to /tokenize endpoint (a2d4d191)
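As I read the PR title, the new option lets a client ask /tokenize to return each token's text piece alongside its numeric id. A minimal sketch of what such a round trip could look like; the field names (`content`, `with_pieces`, `id`, `piece`) and the sample response values are assumptions for illustration, not copied from the diff:

```python
import json

# Hypothetical request body for POST /tokenize (field names assumed).
request_body = json.dumps({
    "content": "Hello world",
    "with_pieces": True,  # ask the server to include token pieces
})

# Illustrative shape of a response when with_pieces is set: each entry
# carries both the token id and its decoded piece (values invented).
sample_response = json.loads(
    '{"tokens": [{"id": 15043, "piece": "Hello"}, {"id": 3186, "piece": " world"}]}'
)

ids = [t["id"] for t in sample_response["tokens"]]
pieces = [t["piece"] for t in sample_response["tokens"]]
text = "".join(pieces)  # concatenated pieces reproduce the input text
print(ids, text)
```

Without the option, the endpoint would presumably return only the bare id list, so clients needing the surface form previously had to detokenize each id separately.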
github-actions added the examples, python, and server labels
ggerganov approved these changes on 2024-08-21
ngxson requested changes on 2024-08-21
server : Add tokenize with pieces tests to server.feature (198daa4e)
Handle case if tokenizer splits along utf8 continuation bytes (b11e63ce)
Add example of token splitting (42fb6707)
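The commit above handles tokenizers that cut a multi-byte UTF-8 character between tokens, in which case an individual piece is not valid UTF-8 on its own. A small pure-Python illustration of the underlying problem (not the PR's own code):

```python
# The llama emoji is a 4-byte UTF-8 sequence; a byte-level tokenizer
# may cut it anywhere, leaving halves that do not decode on their own.
data = "🦙".encode("utf-8")          # b'\xf0\x9f\xa6\x99'
first, second = data[:2], data[2:]   # a split between continuation bytes

# Decoding either half alone fails ...
try:
    first.decode("utf-8")
    valid_alone = True
except UnicodeDecodeError:
    valid_alone = False

# ... so such a piece has to be reported as raw bytes (e.g. a list of
# byte values in JSON), while the concatenation of all pieces still
# decodes to the original text.
whole = (first + second).decode("utf-8")
print(valid_alone, list(first), whole)
```

This is presumably why a "piece" cannot always be a plain JSON string: a response format has to fall back to raw bytes for the invalid-UTF-8 case.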
Remove trailing ws (0c5baa1c)
mathijshenquet requested a review from ngxson 1 year ago
Fix trailing ws (0d198bbf)
Maybe fix ci (0fbed972)
github-actions added the devops label
ngxson approved these changes on 2024-08-27
mofosyne added the Review Complexity : Low label
Merge branch 'master' into feature/tokenize-with-pieces (ad971140)
maybe this fix windows ci? (661a740d)
ngxson merged 78203641 into master 1 year ago
Reviewers: ngxson, ggerganov
Assignees: No one assigned
Labels: examples, Review Complexity : Low, python, devops, server
Milestone: No milestone