text-generation-inference
feat(server): batch tokenization for flash causal lm #411 (Merged)
OlivierDehaene merged 2 commits into main from feat/batch_tokenization_flash
Commits:
feat(server): batch tokenization for flash causal lm (89c5621e)
black (e09314a7)
OlivierDehaene merged 6abec14a into main 2 years ago
OlivierDehaene deleted the feat/batch_tokenization_flash branch 2 years ago
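The PR diff is not shown on this page, but the title describes a common refactor: instead of calling the tokenizer once per request when building a flash-causal-LM batch, tokenize all request inputs in a single batched call. The sketch below illustrates that before/after pattern under stated assumptions; the `StubTokenizer`, `from_pb_old`, and `from_pb_new` names are hypothetical stand-ins, not the actual code from text-generation-inference (the real server would use a Hugging Face tokenizer, whose batched entry point can parallelize encoding internally).

```python
# Hypothetical sketch of the change described by the PR title: tokenize all
# requests in one batched call instead of one call per request. The tokenizer
# here is a trivial whitespace stub standing in for a real HF tokenizer.
from typing import List


class StubTokenizer:
    """Stand-in for a real tokenizer; encodes each whitespace token as its length."""

    def encode(self, text: str) -> List[int]:
        # One call per request (the pre-PR pattern).
        return [len(tok) for tok in text.split()]

    def encode_batch(self, texts: List[str]) -> List[List[int]]:
        # One call for the whole batch (the post-PR pattern). A real fast
        # tokenizer can process such a batch in parallel in native code.
        return [self.encode(t) for t in texts]


def build_batch_old(requests: List[str], tokenizer: StubTokenizer) -> List[List[int]]:
    # Before: per-request tokenization inside the batch-construction loop.
    return [tokenizer.encode(r) for r in requests]


def build_batch_new(requests: List[str], tokenizer: StubTokenizer) -> List[List[int]]:
    # After: a single batched tokenization call up front.
    return tokenizer.encode_batch(requests)


if __name__ == "__main__":
    reqs = ["hello world", "batch tokenization for flash causal lm"]
    tok = StubTokenizer()
    # Both paths produce identical token ids; only the call pattern differs.
    assert build_batch_old(reqs, tok) == build_batch_new(reqs, tok)
    print(build_batch_new(reqs, tok))
```

The benefit of the batched call is not a change in output but in cost: one crossing into the tokenizer per batch rather than per request, which matters when the tokenizer can parallelize internally.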
Reviewers: no reviews
Assignees: no one assigned
Labels: none yet
Milestone: no milestone