transformers
f84d85ba
- [`FA-2`] Add Flash Attention to `Phi` (#27661)
Commit
2 years ago
[`FA-2`] Add Flash Attention to `Phi` (#27661)

* add FA and modify doc file
* test_flash_attn_2_generate_padding_right test overwritten
* comment
* modify persimmon modeling file
* added speedup graph
* more changes
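With this change merged, Flash Attention 2 can be enabled for Phi through the standard `from_pretrained` flag in transformers. The snippet below is a minimal usage sketch, not part of the commit itself; it assumes the `microsoft/phi-1_5` checkpoint, a CUDA GPU, and the `flash-attn` package installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; any Phi checkpoint supported by this commit should work.
model_id = "microsoft/phi-1_5"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Flash Attention 2 requires half-precision weights and the flash-attn package.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    attn_implementation="flash_attention_2",
).to("cuda")

inputs = tokenizer("def quicksort(arr):", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```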
References
#62 - Add initial DEIMv2 model implementation
#58 - Add EoMT DINOv3 model
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#32831 - [Docs] Update resources
#29969 - [SigLIP] Add fast tokenizer
#41212 - Add EoMT with DINOv3 backbone
#39821 - Support MetaCLIP 2
#33111 - [Backbone] Remove out_features everywhere
#33174 - [Zero-shot image classification pipeline] Remove tokenizer_kwargs
#27661 - [`FA-2`] Add Flash Attention to `Phi`
Author
susnato
Parents
06f56168