transformers
2f1a8ad4
- Fix setting attention for multimodal models (#39984)
Commit
182 days ago
Fix setting attention for multimodal models (#39984)

* fix
* use non-explicit `None`
* keep previously set attn if exists
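The commit body is terse, so below is a minimal sketch of the behaviour it describes: when setting the attention implementation on a multimodal model made of sub-configs, default to a non-explicit `None` and keep any implementation that was already set on a sub-config, only overriding it when the caller passes one explicitly. The names (`MultimodalConfig`, `set_attn_implementation`, the `"sdpa"` fallback) are hypothetical and not the actual code path touched by #39984.

```python
# Illustrative sketch only; not the transformers implementation.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SubConfig:
    # None means "not set yet"; concrete values could be "eager", "sdpa", etc.
    attn_implementation: Optional[str] = None


@dataclass
class MultimodalConfig:
    text_config: SubConfig = field(default_factory=SubConfig)
    vision_config: SubConfig = field(default_factory=SubConfig)


def set_attn_implementation(
    config: MultimodalConfig, attn_implementation: Optional[str] = None
) -> None:
    """Propagate an attention implementation to the sub-configs.

    The default is a non-explicit ``None``: if the caller does not request an
    implementation, any value previously set on a sub-config is kept rather
    than being overwritten.
    """
    for sub_config in (config.text_config, config.vision_config):
        if attn_implementation is not None:
            # An explicit request always wins.
            sub_config.attn_implementation = attn_implementation
        elif sub_config.attn_implementation is None:
            # Nothing requested and nothing set before: fall back to a default
            # (the "sdpa" choice here is an assumption for illustration).
            sub_config.attn_implementation = "sdpa"
        # else: keep the previously set attention implementation as-is.


config = MultimodalConfig(vision_config=SubConfig(attn_implementation="eager"))
set_attn_implementation(config)           # vision keeps "eager", text falls back to "sdpa"
set_attn_implementation(config, "eager")  # an explicit request overrides both sub-configs
```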
References
#39984 - Fix setting attention for multimodal models
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#62 - Add initial DEIMv2 model implementation
#65 - Fix RTDetrV2 sine position embedding ordering
Author
zucchini-nlp
Parents
a2e76b90