transformers
4dd4a8fa - Fix GPT-2 Flash Attention 2 generation with left-padding (#41966)

Committed 77 days ago
* Fix GPT-2 Flash Attention 2 generation with left-padding
* repo consistency
* define is_causal in init
* fix
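The fix concerns how a causal mask interacts with left-padded inputs: when generation pads sequences on the left, padded positions sit at the *start* of the sequence, so a purely causal ("attend to everything before me") mask would wrongly let real tokens attend to pad tokens. Flash Attention 2 kernels typically take only an `is_causal` flag, so the padding mask must be accounted for separately. The following is a minimal plain-Python sketch (not the transformers implementation; `combined_mask` is a hypothetical helper) of what the combined causal-plus-padding mask expresses:

```python
# Hypothetical sketch: combine a causal mask with a left-padding mask so
# that pad tokens at the start of the sequence are never attended to.

def combined_mask(attention_mask):
    """attention_mask: list of 0/1 per position (0 = left pad).
    Returns an n x n boolean matrix: entry [i][j] is True when
    query position i may attend to key position j."""
    n = len(attention_mask)
    return [
        [
            j <= i                          # causal: no attending to the future
            and attention_mask[j] == 1      # key must be a real token
            and attention_mask[i] == 1      # pad queries attend to nothing
            for j in range(n)
        ]
        for i in range(n)
    ]

# Left-padded sequence: two pad tokens followed by three real tokens.
mask = combined_mask([0, 0, 1, 1, 1])
print(mask[4])  # the last real token attends only to the three real positions
```

With left-padding, rows for real tokens are False at the leading pad columns, which is exactly the information a bare `is_causal=True` kernel flag cannot convey on its own.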