transformers
fcea1e1f - Fixes Flash Attention implementation for models (#42149)

Squashed commits:
* flash-att3 fix for smolvlm2
* flash-att3 fix for idefics2
* idefics2 changes
* reset idefics2
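For context (not part of the commit itself): in transformers, the Flash Attention backend is selected with the attn_implementation argument when loading a model. A minimal sketch of how the affected models would be loaded with that backend, assuming a CUDA GPU with the flash-attn package installed; the model id shown is illustrative, not taken from the commit:

    import torch
    from transformers import AutoModelForVision2Seq

    # Load Idefics2 with the Flash Attention 2 backend; the same argument
    # applies to SmolVLM2 checkpoints. Requires flash-attn and a supported GPU.
    model = AutoModelForVision2Seq.from_pretrained(
        "HuggingFaceM4/idefics2-8b",
        torch_dtype=torch.bfloat16,
        attn_implementation="flash_attention_2",  # or "flash_attention_3" where supported
    ).to("cuda")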