llama.cpp
03c0946d - convert : support models with multiple chat templates (#6588)

Commit
1 year ago
convert : support models with multiple chat templates (#6588)

* Support converting models with multiple chat templates

  Adds the following metadata:
  * tokenizer.chat_templates
  * tokenizer.chat_template.<name1>
  * tokenizer.chat_template.<name2>
  * tokenizer.chat_template.<...>

  Here `tokenizer.chat_templates` is an array of the template names (except `default`); the `default` template is stored in the regular `tokenizer.chat_template` key.

* replace filtered characters with underscore

* New script to add/modify/remove metadata

  This script creates a copy of a GGUF file and lets you add, modify, or remove metadata in the process. Most importantly, it allows you to update chat templates, either as a string or directly from an updated tokenizer_config.json file.

* Add files via upload

  add new script to project/readme

* flake--
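For context, below is a minimal sketch of how multiple chat templates might be written with gguf-py's `GGUFWriter`. The list-of-dicts shape passed to `add_chat_template`, the resulting keys, the template strings, and the output file name are assumptions based on this commit's description, not verbatim code from it.

```python
# Minimal sketch of writing multiple chat templates with gguf-py (assumed API shape
# after this change; template strings and the output file name are placeholders).
from gguf import GGUFWriter

writer = GGUFWriter("chat-templates-only.gguf", arch="llama")

# A plain string still populates tokenizer.chat_template as before.
# Per the commit message, a list of named templates is expected to produce:
#   tokenizer.chat_template            <- the "default" template
#   tokenizer.chat_templates           <- the remaining names, e.g. ["tool_use"]
#   tokenizer.chat_template.tool_use   <- the named template itself
writer.add_chat_template([
    {"name": "default",  "template": "{% for m in messages %}{{ m['content'] }}{% endfor %}"},
    {"name": "tool_use", "template": "[TOOLS]{% for m in messages %}{{ m['content'] }}{% endfor %}"},
])

# Standard gguf-py write sequence (no tensors are added in this sketch).
writer.write_header_to_file()
writer.write_kv_data_to_file()
writer.write_tensors_to_file()
writer.close()
```

Note that, per the "replace filtered characters with underscore" change, template names are sanitized before being used in the `tokenizer.chat_template.<name>` key.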
Files changed
  • gguf-py
    • README.md
    • gguf
      • constants.py
      • gguf_writer.py
      • vocab.py
    • pyproject.toml
    • scripts
      • __init__.py
      • gguf-new-metadata.py