transformers
b11b28cc
Hotfix: Flash Attention 2 support in Pixtral (#38146)
Commit
228 days ago
Hotfix: Flash Attention 2 support in Pixtral (#38146)

Setting attention_mask to None when flash_attention_2 is selected.

Co-authored-by: aurelien.lac <aurelien.lac@lighton.ai>
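A minimal sketch of the pattern described in the commit message: when the configured attention backend is flash_attention_2, the explicit attention mask is dropped (set to None) before it reaches the attention layers, since the flash kernel does not take a dense mask tensor. The function name `prepare_attention_mask` and the `attn_implementation` argument below are illustrative assumptions, not the actual Pixtral code touched by this commit.

```python
from typing import Optional

import torch


def prepare_attention_mask(
    attention_mask: Optional[torch.Tensor],
    attn_implementation: str,
) -> Optional[torch.Tensor]:
    """Return the mask to forward into the attention layers.

    When flash_attention_2 is selected, no mask tensor is forwarded;
    other implementations (e.g. eager, sdpa) keep the mask as given.
    """
    if attn_implementation == "flash_attention_2":
        return None
    return attention_mask


if __name__ == "__main__":
    mask = torch.ones(1, 16, dtype=torch.long)
    print(prepare_attention_mask(mask, "eager"))              # original mask
    print(prepare_attention_mask(mask, "flash_attention_2"))  # None
```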
References
#38146 - Hotfix: Flash Attention 2 support in Pixtral
Author
uminaty
Parents
0e0e5c10