text-generation-inference
90b226db - We can have a tokenizer anywhere. (#2527)

We can have a tokenizer anywhere. (#2527)

* We can have a tokenizer anywhere.
* Handling potential lack of offsets (python tokenizer)
* Remove redundancy.
* Fixing the tests.
* Flake.lock update?
* Fixing the GIL locking.
* Fixing mamba by using the transformers version.
* Adding the legacy handle.
* Elide lifetime.
* Lint.
* Deprecation message.
* Fixing bad rebase.
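For context on the "Handling potential lack of offsets (python tokenizer)" item: fast tokenizers report byte offsets for each token, while slow (Python) tokenizers may not, so code that slices the original input needs a fallback. The sketch below is a hypothetical illustration of that pattern, not the actual text-generation-inference code; the `token_text` helper and its signature are assumptions made for this example.

```rust
/// Hypothetical helper: recover the text span for a token.
///
/// Fast tokenizers return byte offsets into the original input, so the input
/// can be sliced directly; slow (Python) tokenizers may return no offsets at
/// all, in which case we fall back to the decoded token string itself.
fn token_text<'a>(
    input: &'a str,
    decoded_token: &'a str,
    offsets: Option<(usize, usize)>,
) -> &'a str {
    offsets
        // `str::get` returns None for out-of-range or non-char-boundary
        // offsets instead of panicking, so bad offsets also hit the fallback.
        .and_then(|(start, end)| input.get(start..end))
        .unwrap_or(decoded_token)
}

fn main() {
    let input = "Hello world";
    // Offsets present (fast tokenizer): slice the original input.
    assert_eq!(token_text(input, "world", Some((6, 11))), "world");
    // Offsets missing (python tokenizer): fall back to the decoded token.
    assert_eq!(token_text(input, "world", None), "world");
}
```

Using `str::get` rather than direct indexing keeps the fallback path total: malformed offsets degrade to the decoded token instead of panicking.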