transformers
update flex attention to use `return_aux` instead of `return_lse` when torch version >= 2.9
#44684
Merged
Cyrilvallez merged 4 commits into huggingface:main from ntenenz:update-flex
update flex attention to use `return_aux` instead of `return_lse` whe… (7d9b7b69)
gambletan commented on 2026-03-14
vasqu approved these changes on 2026-03-16
cleaning things up a bit (6d5b2045)
Merge branch 'main' into update-flex (d069d51b)
Merge branch 'main' into update-flex (dfeec7a2)
Cyrilvallez approved these changes on 2026-03-18
Cyrilvallez merged 2bbbbee3 into main 13 days ago
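The change described by the PR title can be sketched as a small version gate: request the log-sum-exp via `return_aux` on torch >= 2.9, and fall back to the older `return_lse` flag on earlier releases. This is a minimal sketch of the dispatch logic only; the dict payload passed for `return_aux` is a placeholder assumption (the actual torch-side type may differ), and `flex_attention` itself is not called here.

```python
# Hypothetical helper illustrating the version-gated kwargs described by
# the PR title. The `return_aux` payload shape is an assumption.
from packaging import version


def lse_kwargs(torch_version: str) -> dict:
    """Return the kwargs that request the LSE output from flex attention."""
    if version.parse(torch_version) >= version.parse("2.9"):
        # torch >= 2.9: `return_lse` is superseded by `return_aux`
        return {"return_aux": {"lse": True}}  # placeholder payload, assumed shape
    # older torch: the legacy boolean flag
    return {"return_lse": True}


print(lse_kwargs("2.8.0"))  # → {'return_lse': True}
print(lse_kwargs("2.9.0"))  # → {'return_aux': {'lse': True}}
```

Comparing parsed versions (rather than strings) keeps pre-release tags like `2.9.0.dev` ordering correctly.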
Reviewers: Cyrilvallez, vasqu, gambletan