transformers
Fixes the inconsistency of the optionality of attention_mask
#37153
Merged


Zephyr271828
github-actions marked this pull request as draft 333 days ago
Zephyr271828 changed the title Yufeng xu → Fixes the inconsistency of the optionality of attention_mask 333 days ago
Zephyr271828 marked this pull request as ready for review 333 days ago
github-actions requested a review from ArthurZucker 333 days ago
github-actions requested a review from Rocketknight1 333 days ago
Godofnothing commented on 2025-03-31
Rocketknight1 approved these changes on 2025-04-01
Zephyr271828 debugging issue 36758 b16084e7
Zephyr271828 debugging issue 36758 043c2465
Zephyr271828 debugging issue 36758 d4757b1d
Zephyr271828 updated attn_mask type specification in _flash_attention_forward 7d234f44
Zephyr271828 removed pdb e66e8275
Zephyr271828 added a blank line 585ce298
Zephyr271828 removed indentation 1cbd59df
Rocketknight1 force-pushed from f7d8fc8a to 1cbd59df 332 days ago
Zephyr271828 Merge branch 'main' into Yufeng-Xu 516836ea
Rocketknight1 merged bf41e54f into main 332 days ago
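The PR title and the commit "updated attn_mask type specification in _flash_attention_forward" concern making the `attention_mask` parameter consistently optional. As a rough illustration only (not the actual transformers implementation — the function name, mask convention, and plain-list "tensors" below are all hypothetical), an optional-mask contract typically looks like this: the parameter defaults to `None`, meaning "attend to everything", and a provided mask zeroes out the masked positions.

```python
from typing import List, Optional


def attention_forward_sketch(
    scores: List[float],
    attention_mask: Optional[List[int]] = None,
) -> List[float]:
    """Hypothetical sketch of an optional attention_mask contract.

    attention_mask defaults to None ("attend everywhere"); when given,
    positions whose mask value is 0 are zeroed out. Real attention code
    operates on tensors; plain lists stand in here for simplicity.
    """
    if attention_mask is None:
        # No mask supplied: every position is kept as-is.
        return list(scores)
    # Mask supplied: keep positions flagged 1, zero out positions flagged 0.
    return [s if keep else 0.0 for s, keep in zip(scores, attention_mask)]
```

Both call styles then type-check and behave consistently — `attention_forward_sketch(scores)` and `attention_forward_sketch(scores, mask)` — which is the kind of uniformity the PR title describes.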
