text-generation-inference
1883d8ec
- feat(docker): improve flash_attention caching (#160)
Commit
2 years ago
feat(docker): improve flash_attention caching (#160)
References: #160 - feat(docker): improve flash_attention caching
Author: OlivierDehaene
Parents: 3f2542bb
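The commit title points at a standard Docker technique: compiling a slow dependency like flash-attention in its own build stage so that its layer stays cached and ordinary source edits never trigger a recompile. A minimal sketch of that pattern follows; the base images, stage name, paths, and pinned-clone step are illustrative assumptions, not taken from this repository's actual Dockerfile.

```dockerfile
# Hypothetical sketch of Docker layer caching for a slow dependency build.
# Stage names, images, and paths are illustrative only.

# Build flash-attention in a dedicated stage. Docker caches this layer,
# so it is only rebuilt when an instruction above it changes.
FROM nvidia/cuda:11.8.0-devel-ubuntu22.04 AS flash-att-builder
RUN apt-get update && apt-get install -y git python3-pip
RUN pip install torch packaging ninja
# Cloning at a fixed commit keeps the cache key stable across builds.
RUN git clone https://github.com/Dao-AILab/flash-attention.git /flash-attention \
    && cd /flash-attention \
    && git checkout <pinned-commit> \
    && python3 setup.py build

# Final image: copy the prebuilt artifacts instead of recompiling.
FROM nvidia/cuda:11.8.0-runtime-ubuntu22.04
COPY --from=flash-att-builder /flash-attention/build /opt/flash-attention/build
# Application source is copied last, so editing it invalidates only
# this layer and never the flash-attention build above.
COPY . /usr/src/app
```

The key ordering rule: expensive, rarely-changing steps go in early layers (or separate stages), and frequently-changing inputs such as application source are copied as late as possible.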