text-generation-inference
feat(server): batch tokenization for flash causal lm #411
Merged

Commits
89c5621e OlivierDehaene: feat(server): batch tokenization for flash causal lm
e09314a7 OlivierDehaene: black
OlivierDehaene merged 6abec14a into main 2 years ago
OlivierDehaene deleted the feat/batch_tokenization_flash branch 2 years ago
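
The PR title names the change: tokenize all request inputs for a flash causal LM batch in a single call rather than once per request. Below is a minimal sketch of the idea, assuming a Hugging Face fast tokenizer; the GPT-2 tokenizer and the helper function names are illustrative only, not the PR's actual code.

```python
# Illustrative sketch only: contrasts per-request tokenization with a single
# batched tokenizer call, which is what "batch tokenization" refers to here.
from transformers import AutoTokenizer

# Any fast (Rust-backed) tokenizer works; gpt2 is just a stand-in.
tokenizer = AutoTokenizer.from_pretrained("gpt2")


def tokenize_per_request(inputs):
    # Baseline: one tokenizer call per request in the batch.
    return [
        tokenizer(text, truncation=True, max_length=1024)["input_ids"]
        for text in inputs
    ]


def tokenize_batched(inputs):
    # Batched: a single call encodes every input at once; fast tokenizers
    # handle the whole list internally, avoiding a call per request.
    return tokenizer(inputs, truncation=True, max_length=1024)["input_ids"]


if __name__ == "__main__":
    requests = ["Hello, world!", "What is Deep Learning?"]
    # Without padding, both paths produce identical token id lists.
    assert tokenize_per_request(requests) == tokenize_batched(requests)
```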

Reviewers: no reviews
Assignees: no one assigned
Labels: none
Milestone: none