transformers
bf41e54f - Fixes the inconsistency of the optionality of attention_mask (#37153)

Commit
330 days ago
Fixes the inconsistency of the optionality of attention_mask (#37153)

* debugging issue 36758
* updated attn_mask type specification in _flash_attention_forward
* removed pdb
* added a blank line
* removed indentation