transformers
4dd4a8fa
- Fix GPT-2 Flash Attention 2 generation with left-padding (#41966)
Commit
77 days ago
Fix GPT-2 Flash Attention 2 generation with left-padding (#41966)

* Fix GPT-2 Flash Attention 2 generation with left-padding
* repo consistency
* define is_causal in init
* fix
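For context, below is a minimal sketch of the scenario this commit addresses: batched generation with GPT-2 running under Flash Attention 2 and left-padded inputs. The prompts, dtype, and device are illustrative assumptions, not taken from the commit, and running it requires a supported CUDA GPU with the flash-attn package installed.

```python
# A minimal sketch of the setup this commit fixes: GPT-2 batched
# generation under Flash Attention 2 with left-padded prompts.
# Assumes a CUDA GPU with flash-attn installed; prompts, dtype,
# and max_new_tokens are illustrative choices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# GPT-2 ships without a pad token; reuse EOS and pad on the left so
# newly generated tokens follow each prompt's last real token.
tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    "gpt2",
    attn_implementation="flash_attention_2",
    torch_dtype=torch.float16,  # FA2 requires fp16 or bf16
).to("cuda")

# Prompts of unequal length are left-padded to a common length.
inputs = tokenizer(
    ["Hello, my name is", "The quick brown fox"],
    return_tensors="pt",
    padding=True,
).to("cuda")

outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```

Per the commit message, part of the fix is defining `is_causal` in the attention module's `__init__` rather than elsewhere, so the Flash Attention 2 path handles left-padded generation correctly.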
References
#41966 - Fix GPT-2 Flash Attention 2 generation with left-padding
Author
Abdennacer-Badaoui
Parents
03538a80