transformers
1ac2463d
- [`FA2`] Add flash attention for `DistilBert` (#26489)
Commit
2 years ago
[`FA2`] Add flash attention for `DistilBert` (#26489)

* flash attention added for DistilBert
* fixes
* removed padding_masks
* Update modeling_distilbert.py
* Update test_modeling_distilbert.py
* style fix
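A minimal sketch of how FlashAttention-2 might be enabled for DistilBert after this change, assuming a recent transformers release that supports the `attn_implementation` argument (earlier versions around this commit used `use_flash_attention_2=True`), a CUDA GPU, and the `flash-attn` package installed; the checkpoint name and task head below are illustrative, not part of the commit:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumes a CUDA device and the flash-attn package are available.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    torch_dtype=torch.float16,                # FA2 requires fp16 or bf16 weights
    attn_implementation="flash_attention_2",  # route attention through FlashAttention-2
).to("cuda")

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
inputs = tokenizer("Flash attention test", return_tensors="pt").to("cuda")

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)
```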
References
#26489 - [`FA2`] Add flash attention for `DistilBert`
Author
susnato
Parents
5964f820