transformers: Fix attention mask for RT-DETR-based models (#40269, Merged)