Validate special token ids are in range when loading GGUF model #3635
KerfuffleV2
marked this pull request as draft 2 years ago
KerfuffleV2
changed the title from "Valid special token ids are in range when loading GGUF model" to "Validate special token ids are in range when loading GGUF model" 2 years ago
KerfuffleV2
marked this pull request as ready for review 2 years ago
ggerganov
approved these changes
on 2023-10-17
Add validation for special token ids to llama.cpp
d1075f6e
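The core idea of this commit is that a special token id (BOS, EOS, etc.) read from GGUF metadata must lie within `[0, n_vocab)`, or the loader should warn and ignore it rather than index out of bounds later. A minimal Python sketch of that check (the function name and warning text here are illustrative, not llama.cpp's actual C++ code):

```python
from typing import Optional

def validate_special_token_id(name: str, token_id: Optional[int], n_vocab: int) -> Optional[int]:
    """Return token_id if it is a valid vocab index, otherwise warn and drop it.

    Hypothetical helper mirroring the range check this PR adds to llama.cpp.
    """
    if token_id is None:
        # Key absent in the GGUF metadata; nothing to validate.
        return None
    if not (0 <= token_id < n_vocab):
        print(f"warning: special token {name!r} id {token_id} is out of range "
              f"(vocab size {n_vocab}); ignoring it")
        return None
    return token_id
```

For example, with a 32000-token vocabulary, an EOS id of 999999 would be rejected and the model would fall back to having no EOS token instead of crashing.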
Fix BPE newline check, only I could break something so simple
14be9d91
Killll meeeeee
32383bbd
Account for GGUF_KEY_KEY only setting when the key exists
4079668c
Minor code cleanups.
22b914e0
Fix convert.py error msg when added tokens are out of range
3a007e2c
Make gguf SpecialVocab vocab size-aware
8796025b
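Making `SpecialVocab` "vocab size-aware" means the Python-side writer can reject out-of-range special token ids at conversion time, before they ever reach a GGUF file. A hedged sketch of that behavior (this is a simplified stand-in, not the real `gguf.SpecialVocab` class):

```python
from typing import Dict, Optional

class SpecialVocab:
    """Illustrative sketch: collects special token ids, optionally bounds-checked."""

    def __init__(self, n_vocab: Optional[int] = None) -> None:
        # When n_vocab is None, no range checking is possible (old behavior).
        self.n_vocab = n_vocab
        self.special_token_ids: Dict[str, int] = {}

    def add_special_token_id(self, typ: str, token_id: int) -> None:
        if self.n_vocab is not None and not (0 <= token_id < self.n_vocab):
            print(f"warning: {typ} token id {token_id} out of range "
                  f"(vocab size {self.n_vocab}); skipping")
            return
        self.special_token_ids[typ] = token_id
```

Passing the known vocabulary size at construction lets a converter such as convert.py report a clear error for bad added tokens instead of silently writing an invalid id.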
Avoid a string copy
76b05fc4
ggerganov
merged
a5e7dbd6
into master 2 years ago
KerfuffleV2
deleted the fix-handle-bad-special-token-ids branch 2 years ago