transformers bfb954fa - fix attention mask for flash attention
Commit: fix attention mask for flash attention (275 days ago)

References: fix-flash-attention-with-static-cache
Author: qgallouedec
Parents: e34b0733
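
The diff itself is not shown on this page, so the sketch below is only a rough illustration of the area the commit title points at, not the commit's actual change. It assumes the common transformers convention that flash-attention code paths consume a 2D padding mask (1 = real token, 0 = padding), while eager/SDPA paths with a static cache build a 4D additive mask; the helper name and tensor shapes are illustrative assumptions.

```python
# Illustrative only: not the diff from this commit. Assumes flash-attention
# kernels expect a 2D (batch, seq_len) padding mask, while static-cache code
# paths prepare a 4D additive mask (0 = visible, -inf = blocked).
import torch


def four_d_to_padding_mask(additive_mask: torch.Tensor, seq_len: int) -> torch.Tensor:
    """Recover a 2D padding mask from a 4D additive attention mask.

    additive_mask: (batch, 1, q_len, kv_len), 0 for visible positions and a
    large negative value for blocked ones, as built for eager/SDPA attention.
    """
    # A key/value position corresponds to a real (non-padding) token if at
    # least one query row is allowed to attend to it.
    kv_visible = (additive_mask[:, 0] == 0).any(dim=1)   # (batch, kv_len)
    return kv_visible[:, :seq_len].to(torch.long)        # (batch, seq_len)


if __name__ == "__main__":
    batch, q_len, kv_len = 2, 4, 4
    # Toy causal mask with the first token of the second sample padded out.
    causal = torch.triu(torch.full((q_len, kv_len), float("-inf")), diagonal=1)
    mask = causal.expand(batch, 1, q_len, kv_len).clone()
    mask[1, :, :, 0] = float("-inf")
    print(four_d_to_padding_mask(mask, kv_len))
    # tensor([[1, 1, 1, 1],
    #         [0, 1, 1, 1]])
```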