transformers
b11b28cc - Hotfix: Flash Attention 2 support in Pixtral (#38146)

Sets `attention_mask` to `None` when `flash_attention_2` is selected.

Co-authored-by: aurelien.lac <aurelien.lac@lighton.ai>
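The fix described above can be sketched as a small guard before the attention call. The helper name and signature below are hypothetical, for illustration only; the actual patch lives inside Pixtral's attention code in Transformers:

```python
def prepare_attention_mask(attention_mask, attn_implementation):
    """Drop the explicit attention mask when Flash Attention 2 is used.

    Flash Attention 2 does not consume an additive attention-mask tensor
    the way eager/SDPA paths do, so passing one along can break the
    forward pass; returning None sidesteps that. (Hypothetical helper,
    not the actual Transformers API.)
    """
    if attn_implementation == "flash_attention_2":
        return None
    return attention_mask


# Usage: the eager/SDPA path keeps its mask, the FA2 path gets None.
mask = [[1, 1, 0]]
print(prepare_attention_mask(mask, "flash_attention_2"))  # None
print(prepare_attention_mask(mask, "sdpa"))               # [[1, 1, 0]]
```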