transformers
fcea1e1f
- Fixes Flash Attention implementation for models (#42149)
Commit
56 days ago
Fixes Flash Attention implementation for models (#42149)

* flash-att3 fix for smolvlm2
* flash-att3 fix for idefics2
* idefics2 changes
* reset idefics2
References
#42149 - Fixes Flash Attention implementation for models
Author
i3hz
Parents
563f2ffb