transformers
c30a6b7b
- [paged-attention] fix off-by-1 error in paged attention generation (#39258)
Commit
168 days ago
[paged-attention] fix off-by-1 error in paged attention generation (#39258)

* fix off-by-1 error in paged attention generation
* formatting
* use update_with_token
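The commit message mentions an off-by-one error in paged-attention generation and a switch to `update_with_token`. As a minimal sketch (not the actual transformers implementation; the class, signature, and logic here are hypothetical), this is the kind of slot-indexing off-by-one that can occur when appending a generated token to a paged KV cache:

```python
# Hypothetical sketch of a paged KV cache slot index, NOT the
# transformers code. It illustrates the class of off-by-one bug the
# commit message describes: using the wrong write position when
# appending a newly generated token.

class PagedCache:
    """Minimal paged cache: tokens are stored in fixed-size pages."""

    def __init__(self, page_size: int):
        self.page_size = page_size
        self.pages: list[list[int]] = []
        self.length = 0  # number of tokens written so far

    def update_with_token(self, token: int) -> int:
        """Append one token and return the position it was written to."""
        pos = self.length  # correct: the next free slot IS `length`;
        # the off-by-one variant would use `self.length + 1`, skipping
        # slot 0 and overflowing every page one token early.
        page, slot = divmod(pos, self.page_size)
        if page == len(self.pages):
            self.pages.append([0] * self.page_size)  # allocate a new page
        self.pages[page][slot] = token
        self.length += 1
        return pos


cache = PagedCache(page_size=4)
positions = [cache.update_with_token(t) for t in [10, 11, 12, 13, 14]]
# Tokens fill page 0 completely, then the fifth token opens page 1.
```

Routing every append through a single `update_with_token`-style helper, as the commit does, keeps the position arithmetic in one place instead of duplicating it at each call site.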
Author: kashif
Committer: Cyrilvallez
Parents: be7d1a9d