Refactor attention for SigLIP based models #36981
Update Siglip attention implementation (58aa6736)
Update tests for Siglip (98b185d0)
Remove one level of indentation (ab2d4d18)
Update test to be more specific (b35be6d6)
Fixup (72845f06)
Idefics2 (2486c942)
Idefics3 (23d233ba)
Emu3 (27fbf9af)
SmolVLM (d153812f)
Phi4 (just init small update) (0bbada48)
Idefics2 (test fix) (0d4bf7f3)
Update siglip2 tests (3859ebad)
Update eager (1e7abf5a)
qubvel marked this pull request as ready for review 346 days ago
trigger (65531ad4)
Clean up (209e00e0)
Transfer inputs to device in test (81b2ae6f)
Fixing test (351b6851)
Fixing test (8956fd6b)
Revert contiguous (2b5579e2)
Remove unused is_flash_attn_2_available (bb481f10)
Merge branch 'main' into refactor-attention-siglip-based (80ddd1af)
Move flaky to specific models (7052372b)
Merge branch 'main' into refactor-attention-siglip-based (23229d84)
ydshieh approved these changes on 2025-03-28
Merge branch 'main' into refactor-attention-siglip-based (5ebcd87d)
Merge branch 'main' into refactor-attention-siglip-based (a7b6fc2d)
Merge branch 'main' into refactor-attention-siglip-based (c09e6767)
Merge branch 'main' into refactor-attention-siglip-based (f67da416)
Merge branch 'main' into refactor-attention-siglip-based (29358168)
ydshieh merged 3249c5dc into main 339 days ago
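The refactor tracked by these commits moves SigLIP-based models toward a standalone "eager" attention function that alternative backends (SDPA, flash attention) can be swapped in for via a dispatch table. The sketch below illustrates that pattern only; the function names, the `ATTENTION_FUNCTIONS` dict, and the NumPy implementation are illustrative assumptions, not the actual transformers code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def eager_attention(query, key, value, scale=None):
    # query/key/value: (batch, heads, seq, head_dim).
    # Plain scaled dot-product attention, the "eager" reference path.
    if scale is None:
        scale = query.shape[-1] ** -0.5
    attn_weights = softmax(query @ key.transpose(0, 1, 3, 2) * scale)
    return attn_weights @ value, attn_weights

# Hypothetical dispatch table: other backends would register a function
# with the same signature and be selected by a config flag.
ATTENTION_FUNCTIONS = {"eager": eager_attention}

rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((1, 2, 4, 8))
out, weights = ATTENTION_FUNCTIONS["eager"](q, k, v)
print(out.shape)  # (1, 2, 4, 8)
```

The point of the indirection is that model code calls one attention entry point, so adding or removing a backend (e.g. dropping an unused flash-attention check, as one commit above does) touches the dispatch layer rather than every model class.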