transformers
Fix GPT-2 Flash Attention 2 generation with left-padding #41966
Merged

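The PR carries no written description beyond the title, but the branch name (fix/test_flash_attn_2_generate_padding_left) points at the failing scenario: batched generation with GPT-2 under Flash Attention 2, where shorter prompts are left-padded. Below is a minimal repro sketch of that scenario, assuming the stock gpt2 checkpoint and an FA2-capable GPU; it is not code taken from the PR itself.

```python
# Minimal sketch of the scenario this PR fixes (not code from the PR).
# Batched generation requires left-padding so that new tokens are appended
# directly after each prompt; the behavior fixed here affected GPT-2's
# Flash Attention 2 path on such left-padded batches.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

model = AutoModelForCausalLM.from_pretrained(
    "gpt2",
    torch_dtype=torch.float16,                # FA2 requires fp16/bf16
    attn_implementation="flash_attention_2",
).to("cuda")

inputs = tokenizer(
    ["Hello", "A much longer prompt about attention masks"],
    return_tensors="pt",
    padding=True,                             # left-pads the shorter prompt
).to("cuda")

out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```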
Abdennacer-Badaoui force-pushed from 2877ab9b to 4f20a1fa 92 days ago
remi-or requested a review from Cyrilvallez 89 days ago
remi-or requested a review from vasqu 89 days ago
remi-or removed the review request from Cyrilvallez 89 days ago
Cyrilvallez dismissed their stale review 82 days ago: "did not mean to approve, oopsie"
vasqu commented on 2025-11-10
Abdennacer-Badaoui committed 239f6129: Fix GPT-2 Flash Attention 2 generation with left-padding
Abdennacer-Badaoui committed 72edb136: repo consistency
Abdennacer-Badaoui committed 2edb434f: define is_causal in init
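The 2edb434f commit message, "define is_causal in init", suggests the causal flag was moved from per-call logic into the module constructor. A minimal sketch of that pattern follows; the class name and surrounding wiring are assumptions, not the actual transformers diff.

```python
# Illustrative sketch of the "define is_causal in init" pattern; the class
# name and wiring are assumptions, not the actual transformers diff.
import torch.nn as nn

class GPT2AttentionSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # GPT-2 self-attention is always causal, so the flag is fixed once at
        # construction time. Attention backends (such as the Flash Attention 2
        # integration) can then read `module.is_causal` and combine it with
        # the padding mask, rather than re-deriving causality inside forward(),
        # where a single-token decode step on a left-padded batch is easy to
        # misclassify as non-causal.
        self.is_causal = True
```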
Abdennacer-Badaoui force-pushed from 015bba0f to 2edb434f 82 days ago
vasqu commented on 2025-11-10
Abdennacer-Badaoui committed 62440210: fix
vasqu approved these changes on 2025-11-10
vasqu merged 4dd4a8fa into main 82 days ago
Abdennacer-Badaoui deleted the fix/test_flash_attn_2_generate_padding_left branch 82 days ago
