text-generation-inference
feat(docker): improve flash_attention caching #160 (Merged)
OlivierDehaene merged 1 commit into main from feat/cache_flash_att
feat(docker): improve flash_attention caching 98094f4d
OlivierDehaene merged 1883d8ec into main 2 years ago
OlivierDehaene deleted the feat/cache_flash_att branch 2 years ago
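
The PR body is not captured on this page. As general context for the change the title names, an expensive dependency such as flash-attention is commonly cached in Docker by compiling it in a dedicated build stage keyed to a pinned revision, so later stages reuse the cached layer instead of recompiling on every image build. The sketch below is a hypothetical illustration of that pattern only; the stage names, base images, paths, and pinned revision are assumptions, not the actual diff of this PR.

    # Hypothetical multi-stage Dockerfile sketch; not the diff from this PR.
    # Stage 1: compile flash-attention once. The layer is reused from the
    # Docker build cache as long as the pinned revision below is unchanged.
    FROM nvidia/cuda:11.8.0-devel-ubuntu22.04 AS flash-att-builder

    # Assumed pin; changing it invalidates the cached build layer.
    ARG FLASH_ATT_COMMIT=main

    RUN apt-get update && apt-get install -y --no-install-recommends \
        git python3 python3-pip && rm -rf /var/lib/apt/lists/*
    RUN pip3 install --no-cache-dir torch packaging ninja
    RUN git clone https://github.com/HazyResearch/flash-attention.git /flash-attention \
        && cd /flash-attention \
        && git checkout "${FLASH_ATT_COMMIT}" \
        && python3 setup.py build

    # Stage 2: the runtime image copies the prebuilt extension from the
    # cached builder stage instead of rebuilding it.
    FROM nvidia/cuda:11.8.0-runtime-ubuntu22.04 AS base
    COPY --from=flash-att-builder /flash-attention/build /opt/flash-attention/build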
Reviewers: No reviews
Assignees: No one assigned
Labels: None yet
Milestone: No milestone