transformers
0a52bd24 - [fix] sliding window attention mask (#38045)

[fix] sliding window attention mask (#38045)

* fix sliding attn
* make style
* Update tests/test_modeling_common.py
* on second thought, should default to `True` for BC

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
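For context, a minimal sketch of what a sliding window attention mask looks like; this is illustrative only and not the code changed by this commit. The function name, the `window_size` parameter, and the boolean convention (True = position may be attended to) are assumptions made for the example.

```python
# Illustrative sketch, not the transformers implementation touched by this commit.
import torch


def sliding_window_causal_mask(seq_len: int, window_size: int) -> torch.Tensor:
    """Boolean (seq_len, seq_len) mask: query i may attend to key j
    iff j <= i (causal) and i - j < window_size (within the window)."""
    idx = torch.arange(seq_len)
    causal = idx[None, :] <= idx[:, None]            # keys no later than the query
    in_window = idx[:, None] - idx[None, :] < window_size  # keys not too far in the past
    return causal & in_window


if __name__ == "__main__":
    # For seq_len=6, window_size=3 each row has at most 3 True entries,
    # ending at the diagonal.
    print(sliding_window_causal_mask(6, 3).int())
```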