transformers
56c44213
- [detection] fix attention mask for RT-DETR-based models (#40269)
Commit
162 days ago
* Fix get_contrastive_denoising_training_group attention
* Add bool attention_mask conversion
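For context, the commit touches how contrastive denoising query groups are masked from one another. Below is a minimal sketch of such a group-wise boolean attention mask, assuming a True-means-blocked convention where denoising groups only attend within themselves and matching queries cannot see denoising queries; the function name, shapes, and layout are illustrative assumptions, not the actual get_contrastive_denoising_training_group implementation.

```python
import torch

def build_denoising_attention_mask(
    num_groups: int, group_size: int, num_queries: int
) -> torch.Tensor:
    """Illustrative boolean attention mask (True = blocked) over
    [denoising queries | matching queries]; not the transformers code."""
    num_denoising = num_groups * group_size
    total = num_denoising + num_queries
    mask = torch.zeros(total, total, dtype=torch.bool)
    for g in range(num_groups):
        start, end = g * group_size, (g + 1) * group_size
        # A denoising group may only attend to itself among denoising queries.
        mask[start:end, :start] = True
        mask[start:end, end:num_denoising] = True
    # Matching queries must not attend to any denoising query.
    mask[num_denoising:, :num_denoising] = True
    return mask

mask = build_denoising_attention_mask(num_groups=2, group_size=3, num_queries=4)
print(mask.shape)  # torch.Size([10, 10])
```

With num_groups=2, group_size=3, and num_queries=4, the returned 10x10 mask blocks attention between the two denoising groups and hides all denoising queries from the matching queries, which is the kind of isolation the fix concerns.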
References
#40269 - Fix attention mask for RT-DETR-based models
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#41212 - Add EoMT with DINOv3 backbone
#62 - Add initial DEIMv2 model implementation
Author
materight
Parents
5d9a715e