transformers
7b3d4df4
fix: #14486 do not use BertPooler in DPR (#15068)
Commit
3 years ago
fix: #14486 do not use BertPooler in DPR (#15068)

* fix: #14486 do not use BertPooler in DPR
* fix tf dpr as well
* finish

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
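The commit addresses issue #14486: DPR wraps a BERT encoder but never trains BertPooler's weights, so building the pooler only produces randomly initialized, unused parameters and a misleading "weights not initialized" warning. A minimal sketch of what the change amounts to, assuming the transformers DPR encoder layout of that era; the class name below is an illustrative stand-in, while `add_pooling_layer` is BertModel's real constructor flag:

```python
# Sketch of the fix described by the commit message; names other than
# BertModel and add_pooling_layer are hypothetical.
import torch
from torch import nn
from transformers import BertConfig, BertModel


class DPREncoderSketch(nn.Module):
    """Illustrative stand-in for transformers' DPR encoder."""

    def __init__(self, config: BertConfig):
        super().__init__()
        # Before the fix: BertModel(config) also constructed a BertPooler
        # whose weights DPR never trains (the subject of issue #14486).
        # After the fix: skip building the pooler entirely.
        self.bert_model = BertModel(config, add_pooling_layer=False)

    def forward(self, input_ids: torch.Tensor, attention_mask: torch.Tensor = None):
        outputs = self.bert_model(input_ids, attention_mask=attention_mask)
        sequence_output = outputs[0]  # (batch, seq_len, hidden)
        # DPR's "pooled" vector is simply the [CLS] token embedding,
        # not BertPooler's tanh-projected output.
        pooled_output = sequence_output[:, 0, :]
        return sequence_output, pooled_output
```

With `add_pooling_layer=False` the unused pooler weights are never created, which is why the commit message notes the same change was applied to the TensorFlow DPR implementation as well.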
References
#15068 - fix: #14486 do not use BertPooler in DPR
#15748 - Fix segformer reshape last stage
#19449 - [WIP] Fix weights initialization of several vision models
#27720 - Add common processor tests
#29969 - [SigLIP] Add fast tokenizer
#32831 - [Docs] Update resources
#33111 - [Backbone] Remove out_features everywhere
#33174 - [Zero-shot image classification pipeline] Remove tokenizer_kwargs
#39821 - Support MetaCLIP 2
#58 - Add EoMT DINOv3 model
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#41212 - Add EoMT with DINOv3 backbone
#62 - Add initial DEIMv2 model implementation
Author
PaulLerner
Parents
74bec986