text-generation-inference
Fix cache block size for flash decoding
#2351
Merged


danieldk merged 2 commits into main from bugfix/flash-decoding-blocksize
Commits:
278697cf  danieldk  Fix cache block size for flash decoding
f484bcb5  danieldk  Also run CI on changes to `backends`
danieldk merged 22fb1be5 into main 1 year ago
danieldk deleted the bugfix/flash-decoding-blocksize branch 1 year ago
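
The PR carries no description, so the details below are inferred from the title only. As a minimal, hedged sketch of what such a fix typically involves: the KV-cache block size is switched to a larger value when flash decoding is enabled, since flash-decoding kernels expect a different cache block layout than the default paged-attention path. The names `FLASH_DECODING` and `BLOCK_SIZE` are assumptions for illustration and are not confirmed from the actual diff.

```python
# Hypothetical sketch, not the actual diff: pick the KV-cache block size
# based on whether flash decoding is enabled via an environment flag.
import os

# Assumed flag name; flash-decoding kernels generally need a larger
# cache block than the default paged-attention block size.
FLASH_DECODING = os.getenv("FLASH_DECODING", "0").lower() in {"1", "true"}

# Larger block size when flash decoding is on, small default otherwise.
BLOCK_SIZE: int = 256 if FLASH_DECODING else 16
```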
