transformers
1ac2463d - [`FA2`] Add flash attention for `DistilBert` (#26489)

Committed 2 years ago
[`FA2`] Add flash attention for `DistilBert` (#26489)

* flash attention added for DistilBert
* fixes
* removed padding_masks
* Update modeling_distilbert.py
* Update test_modeling_distilbert.py
* style fix