transformers
Add packed tensor format support for flex/sdpa/eager through the mask! #39194
Merged

Cyrilvallez merged 16 commits into main from packing-mask
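The idea behind packed-format support is that several sequences concatenated into one row can be kept from attending to each other purely through the attention mask, with sequence boundaries inferred from `position_ids` resets. The following is a minimal, framework-free sketch of that masking rule (my own illustration, not the PR's actual implementation in `masking_utils.py`): a position id that does not increase by exactly 1 starts a new packed sequence, and the resulting mask is causal within each block and blocked across them.

```python
# Hypothetical sketch of packed-sequence masking (not the PR's code).
# Sequence boundaries are recovered from position_ids: any position that
# is not previous + 1 marks the start of a new packed sequence.

def packed_causal_mask(position_ids):
    """Return an [n, n] boolean mask (True = query may attend to key)
    for one packed row, assuming position_ids restart for each sub-sequence."""
    seq_idx, current, prev = [], -1, None
    for pos in position_ids:
        if prev is None or pos != prev + 1:  # reset => new packed sequence
            current += 1
        seq_idx.append(current)
        prev = pos
    n = len(position_ids)
    # Causal within a sequence (k <= q), never across sequences.
    return [
        [seq_idx[q] == seq_idx[k] and k <= q for k in range(n)]
        for q in range(n)
    ]

# Two sequences of lengths 3 and 2 packed into one row of 5 tokens.
mask = packed_causal_mask([0, 1, 2, 0, 1])
```

Token 3 (the first token of the second sequence) gets a mask row allowing only itself, even though earlier tokens sit to its left in the packed row.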
Cyrilvallez Add the necessary logic to mask_utils
8eefa564
Cyrilvallez add it everywhere
bce4ea74
Cyrilvallez Update masking_utils.py
03643555
Cyrilvallez style
f96014ba
Cyrilvallez Update masking_utils.py
ecef50e0
Cyrilvallez Update modeling_mimi.py
d7fd4211
Cyrilvallez Update masking_utils.py
431fe622
ArthurZucker commented on 2025-07-03
winglian commented on 2025-07-03
Cyrilvallez add support for more than batch size 1
111a3eac
Cyrilvallez Update masking_utils.py
4fed806b
Cyrilvallez add test
810d33f4
Cyrilvallez style
554be7da
Cyrilvallez Update test_masking_utils.py
33dcfb05
Cyrilvallez Update masking_utils.py
7ee171f4
Cyrilvallez add require_token
45e885bb
Cyrilvallez fix tests
f0a3d28d
Cyrilvallez fix
d62f88a2
Cyrilvallez added for patch
winglian approved these changes on 2025-07-03
Cyrilvallez merged 0cf27916 into main 237 days ago
Cyrilvallez deleted the packing-mask branch 237 days ago
