transformers
c651ea98 - [Grounding DINO] Add support for cross-attention in GroundingDinoMultiHeadAttention (#30364)

* Added cross attention support
* Fixed dtypes
* Fixed assumption
* Moved to decoder
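In cross-attention, the queries come from one sequence (here, the decoder) while the keys and values come from another (the encoder's image/text features). A minimal single-head sketch of that computation is below; the names and shapes are illustrative only and do not reflect the actual GroundingDinoMultiHeadAttention API:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    # queries: (tgt_len, d) from the decoder;
    # keys/values: (src_len, d) from the encoder.
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # (tgt_len, src_len)
    weights = softmax(scores, axis=-1)       # attention over encoder positions
    return weights @ values                  # (tgt_len, d)

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))    # hypothetical decoder states
kv = rng.standard_normal((6, 8))   # hypothetical encoder states
out = cross_attention(q, kv, kv)
print(out.shape)
```

The key distinction from self-attention is simply that `keys` and `values` are taken from a different sequence than `queries`, which is why the commit moves this logic to the decoder, where encoder outputs are available.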