llama.cpp
78203641 - server : Add option to return token pieces in /tokenize endpoint (#9108)

Commit
293 days ago
server : Add option to return token pieces in /tokenize endpoint (#9108)

* server : added with_pieces functionality to /tokenize endpoint
* server : Add tokenize with pieces tests to server.feature
* Handle case if tokenizer splits along utf8 continuation bytes
* Add example of token splitting
* Remove trailing ws
* Fix trailing ws
* Maybe fix ci
* maybe this fix windows ci?

---------

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
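The commit message above describes a new `with_pieces` flag for the server's `/tokenize` endpoint. The sketch below shows how a client might exercise it; the `with_pieces` request field comes from the commit message, while the server URL and the response layout (a `tokens` array of objects with `id` and `piece`, where a piece that is not valid UTF-8, e.g. one split along continuation bytes, comes back as a list of byte values) are assumptions for illustration, not the authoritative API.

```python
# Minimal client sketch for /tokenize with the new with_pieces flag.
# Assumes a llama.cpp server is already running on localhost:8080; the
# response shape (tokens -> [{id, piece}]) is illustrative only.
import requests

resp = requests.post(
    "http://localhost:8080/tokenize",
    json={
        "content": "Hello 🦙",   # text to tokenize
        "with_pieces": True,     # also return the text piece for each token
    },
)
resp.raise_for_status()

for tok in resp.json()["tokens"]:
    piece = tok["piece"]
    if isinstance(piece, list):
        # Piece was split along UTF-8 continuation bytes and is returned as raw bytes.
        print(tok["id"], bytes(piece))
    else:
        print(tok["id"], piece)
```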
Files changed
  • .github/workflows/server.yml
  • examples/server/README.md
  • examples/server/server.cpp
  • examples/server/tests/features/server.feature
  • examples/server/tests/features/steps/steps.py
  • examples/server/utils.hpp