text-generation-inference
feat(docker): improve flash_attention caching #160
Merged

OlivierDehaene merged 1 commit into main from feat/cache_flash_att
98094f4d feat(docker): improve flash_attention caching (OlivierDehaene)
OlivierDehaene merged 1883d8ec into main 2 years ago
OlivierDehaene deleted the feat/cache_flash_att branch 2 years ago
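
The diff itself is not shown on this page. As a rough illustration of the kind of change the title describes, below is a minimal multi-stage Dockerfile sketch, assuming the caching improvement comes from building flash-attention in a dedicated, commit-pinned build stage so Docker's layer cache can reuse the expensive kernel compilation. The stage names, paths, base images, and commit pin are hypothetical, not taken from the actual PR.

```Dockerfile
# Hypothetical sketch of the caching pattern the PR title suggests:
# flash-attention is compiled in its own stage whose cache key depends
# only on a pinned commit, so edits to the server sources never
# invalidate this layer and retrigger the long CUDA build.

FROM nvidia/cuda:11.8.0-devel-ubuntu22.04 AS flash-att-builder

RUN apt-get update && apt-get install -y --no-install-recommends \
        git python3 python3-pip ninja-build && \
    rm -rf /var/lib/apt/lists/*

# Pinning to a fixed revision keeps this layer's cache key stable:
# Docker only rebuilds it when the pin (or an earlier line) changes.
ARG FLASH_ATT_COMMIT=abcdef0  # placeholder, not the real pin
RUN git clone https://github.com/HazyResearch/flash-attention.git /flash-attention && \
    cd /flash-attention && \
    git checkout ${FLASH_ATT_COMMIT} && \
    python3 setup.py build

# Final image: copying the prebuilt artifacts out of the cached stage
# keeps application-code changes from ever recompiling the kernels.
FROM nvidia/cuda:11.8.0-base-ubuntu22.04
COPY --from=flash-att-builder /flash-attention/build /opt/flash-attention/build
```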

