transformers
Fixes Flash Attention implementation for models
#42149
Merged


vasqu merged 4 commits into huggingface:main from i3hz:flash3
i3hz: flash-att3 fix for smolvlm2 (3fa08362)
i3hz: flash-att3 fix for idefics2 (5ab8a476)
vasqu commented on 2025-11-11
i3hz: idefics2 changes (86dadcea)
i3hz: reset idefics2 (f7abe0ef)
zucchini-nlp approved these changes on 2025-11-12
vasqu approved these changes on 2025-11-12
vasqu merged fcea1e1f into main 61 days ago
i3hz deleted the flash3 branch 61 days ago

