transformers
update flex attention to use `return_aux` instead of `return_lse` when torch version >= 2.9
#44684
Merged


Cyrilvallez merged 4 commits into huggingface:main from ntenenz:update-flex
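The change described by the PR title gates on the installed torch version. A minimal sketch of that gate (the helper name is hypothetical, and passing `True` for `return_aux` is an assumption — the real torch 2.9 API may expect a structured aux-request object rather than a bool):

```python
def lse_request_kwargs(torch_version: str) -> dict:
    """Pick the flex_attention kwarg used to request the log-sum-exp.

    Hypothetical helper: per this PR, torch >= 2.9 exposes `return_aux`,
    while older versions expose `return_lse`. The value ``True`` for
    `return_aux` is an assumption about the newer API.
    """
    # Handles versions like "2.9.0+cu121" or "2.9.0.dev20260101" -> (2, 9)
    major, minor = (int(p) for p in torch_version.split(".")[:2])
    if (major, minor) >= (2, 9):
        return {"return_aux": True}
    return {"return_lse": True}
```

The resulting dict would then be splatted into the `flex_attention` call, so the rest of the attention code stays identical across torch versions.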
ntenenz: update flex attention to use `return_aux` instead of `return_lse` whe… (7d9b7b69)
gambletan commented on 2026-03-14
vasqu approved these changes on 2026-03-16
cleaning things up a bit (6d5b2045)
ntenenz: Merge branch 'main' into update-flex (d069d51b)
Cyrilvallez: Merge branch 'main' into update-flex (dfeec7a2)
Cyrilvallez approved these changes on 2026-03-18
Cyrilvallez merged 2bbbbee3 into main 13 days ago
