Fix flash-attn for paged_attention when no kernels #41078
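For context, the fix appears to concern the code path in src/transformers/integrations/flash_paged.py that runs when the Hugging Face `kernels` package is not available, so paged attention has to fall back to the `flash_attn` package directly. A minimal sketch of that kind of fallback pattern is shown below; the helper layout and the kernel repo id are illustrative assumptions, not the actual implementation in the PR.

```python
# Illustrative sketch only: prefer a pre-built kernel from the Hugging Face
# `kernels` hub, otherwise fall back to the flash_attn package directly
# (the no-kernels path this PR targets). The repo id below is an assumption.
try:
    from kernels import get_kernel

    _flash = get_kernel("kernels-community/flash-attn")  # hypothetical repo id
    flash_attn_varlen_func = _flash.flash_attn_varlen_func
except (ImportError, Exception):
    # No kernels hub available: use the flash_attn package's varlen kernel,
    # which the paged-attention forward pass then calls with cu_seqlens metadata.
    from flash_attn import flash_attn_varlen_func
```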
Fix non-kernels flash attention paged implementation
9a910f28
Cover all cases
77dbd8e8
Style
703b32a2
remi-or changed the title from "Fix fa" to "Fix flash-attn for paged_attention when no kernels" 102 days ago
MekkCyber approved these changes on 2025-09-25
Update src/transformers/integrations/flash_paged.py
687cd7c4
Merge branch 'main' into fix-fa
c5b20dbc
Apply style fixes
1d8f0cda
Merge branch 'main' into fix-fa
1c8e6604
remi-or merged 97ca0b47 into main 99 days ago