Fix GPT-2 Flash Attention 2 generation with left-padding #41966
vasqu commented on 2025-11-10

Commits:
`239f6129` Fix GPT-2 Flash Attention 2 generation with left-padding
`72edb136` repo consistency
`2edb434f` define is_causal in init
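For context, the bug class this PR addresses comes up because batched generation pads shorter prompts on the left, and an attention implementation that relies on a causal flag alone (such as an `is_causal` attribute set at init) would still attend to those pad positions unless a padding mask is combined with the causal mask. The sketch below is a conceptual, pure-Python illustration of left-padding, not the PR's code; the `left_pad` helper and `PAD` id are hypothetical:

```python
# Conceptual sketch of left-padding for batched generation (hypothetical
# helper, not the PR's code). Sequences of unequal length are padded on the
# LEFT so the newest token of every sequence sits at the same last position,
# which is what autoregressive decoding expects.

PAD = 0  # hypothetical pad token id


def left_pad(batch, pad_id=PAD):
    """Left-pad a batch of token-id lists to equal length.

    Returns (input_ids, attention_mask), where the mask is 0 on pad
    positions and 1 on real tokens.
    """
    width = max(len(seq) for seq in batch)
    input_ids, attention_mask = [], []
    for seq in batch:
        n_pad = width - len(seq)
        input_ids.append([pad_id] * n_pad + seq)
        attention_mask.append([0] * n_pad + [1] * len(seq))
    return input_ids, attention_mask


ids, mask = left_pad([[5, 6, 7], [9]])
# Pad positions carry mask 0; the attention backend (including Flash
# Attention 2) must fold this padding mask into the causal mask, otherwise
# the left pads leak into the attention scores.
print(ids)   # [[5, 6, 7], [0, 0, 9]]
print(mask)  # [[1, 1, 1], [0, 0, 1]]
```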
`62440210` fix
vasqu approved these changes on 2025-11-10
vasqu merged commit `4dd4a8fa` into main 82 days ago
Abdennacer-Badaoui deleted the `fix/test_flash_attn_2_generate_padding_left` branch 82 days ago