fix to accept cumulative_seqlens from TransformersKwargs in FA #40194
Kurt232 changed the title from "fix to accept cumulative_seqlens from TransformersKwargs in FA #40193" to "fix to accept cumulative_seqlens from TransformersKwargs in FA" 204 days ago
fix the typings that do not match the FA function signature (dc0624db)
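For context on the kwargs this fix concerns: flash attention's variable-length path consumes cumulative sequence lengths (prefix sums marking where each packed sequence starts and ends) rather than a padding mask. A minimal sketch of how such boundaries are derived from per-sequence lengths — the helper name below is hypothetical and illustrative, not code from this PR:

```python
from itertools import accumulate


def build_cu_seqlens(seq_lens: list[int]) -> list[int]:
    """Return cumulative sequence-length boundaries [0, l0, l0+l1, ...].

    Varlen flash-attention kernels use these offsets to locate each
    packed sequence inside one flattened (total_tokens, ...) tensor.
    """
    return [0] + list(accumulate(seq_lens))


# Three packed sequences of lengths 3, 5, and 2 -> boundaries [0, 3, 8, 10]
boundaries = build_cu_seqlens([3, 5, 2])
```

With these offsets, tokens `boundaries[i]:boundaries[i+1]` of the flattened batch belong to sequence `i`, which is why a mismatch between the kwarg names carrying them and the attention function's signature silently breaks the varlen path.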
Kurt232 force pushed from fedbf6d0 to dc0624db 198 days ago
format changes by ruff (a9cd0b22)
vasqu commented on 2025-08-21
Kurt232 changed the title from "fix to accept cumulative_seqlens from TransformersKwargs in FA" to "🚨 fix to accept cumulative_seqlens from TransformersKwargs in FA" 198 days ago
Update src/transformers/integrations/flash_paged.py (a1415149)
revert the continuous_batching signature, which is more meaningful (d93cdd1e)
Kurt232 changed the title from "🚨 fix to accept cumulative_seqlens from TransformersKwargs in FA" to "fix to accept cumulative_seqlens from TransformersKwargs in FA" 197 days ago
Kurt232 deleted the fix/args_in_flash_attention_forward branch 194 days ago