vocab: refactor tokenizer to reduce the overhead of creating the tokenizer multiple times #9449
kylo5aby changed the title from "refactor tokenizer to reduce the overhead of creating multi times tokenizer" to "vocab: refactor tokenizer to reduce the overhead of creating multi times tokenizer" 1 year ago
ngxson commented on 2024-09-12
ngxson commented on 2024-09-18
ngxson approved these changes on 2024-09-19
refactor tokenizer (d949c584)
kylo5aby force-pushed to d949c584 1 year ago
llama : make llm_tokenizer more private (6e873e56)
refactor tokenizer (403758f9)
kylo5aby force-pushed from d949c584 to 403758f9 1 year ago
Merge branch 'gg/tokenizer-cleanup' of https://github.com/ggerganov/l… (d653d251)
refactor tokenizer (2ec25dbf)
llama : make llm_tokenizer more private (02629d98)
Merge branch 'ggerganov-gg/tokenizer-cleanup' into refactor-tokenizer (8be5d11c)
remove unused files (25d4599e)
remove unused fields to avoid unused field build error (768c43f8)
avoid symbol link error (21ee3806)
ggerganov approved these changes on 2024-09-24
Update src/llama.cpp (95e433ce)
Update src/llama.cpp (cd145b11)
ggerganov merged 6102037b into master 1 year ago