text-generation-inference
6abec14a
feat(server): batch tokenization for flash causal lm (#411)
Commit: feat(server): batch tokenization for flash causal lm (#411)
Date: 2 years ago
References: #411 - feat(server): batch tokenization for flash causal lm
Author: OlivierDehaene
Parents: 895c5f15
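The commit title points at a common optimization: encoding all requests in a batch with one tokenizer call instead of tokenizing each request separately. A minimal sketch of that pattern, assuming a Hugging Face-style callable tokenizer interface; `ToyTokenizer` and `encode_batch` are hypothetical stand-ins for illustration, not the actual text-generation-inference code:

```python
from typing import List


class ToyTokenizer:
    """Hypothetical stand-in for a real tokenizer: splits on whitespace
    and maps each word to an integer id from a growing vocabulary."""

    def __init__(self) -> None:
        self.vocab: dict = {}

    def __call__(self, texts: List[str]) -> List[List[int]]:
        # Accepting a list and encoding everything in one call is the
        # "batch tokenization" pattern: one invocation per batch
        # rather than one invocation per request.
        return [
            [self.vocab.setdefault(word, len(self.vocab)) for word in text.split()]
            for text in texts
        ]


def encode_batch(tokenizer: ToyTokenizer, requests: List[str]) -> List[List[int]]:
    # Before: [tokenizer([r])[0] for r in requests]  -- one call per request.
    # After: a single batched call covering every request in the batch.
    return tokenizer(requests)


tok = ToyTokenizer()
ids = encode_batch(tok, ["hello world", "hello again"])
print(ids)  # -> [[0, 1], [0, 2]]; "hello" maps to the same id in both
```

With a real fast tokenizer the batched call additionally parallelizes encoding internally, which is where the practical speedup for a serving path like flash causal LM would come from.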