transformers · c651ea98
Committed 1 year ago
[Grounding DINO] Add support for cross-attention in GroundingDinoMultiHeadAttention (#30364)

* Added cross attention support
* Fixed dtypes
* Fixed assumption
* Moved to decoder
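For readers unfamiliar with the change: cross-attention means the attention layer's queries come from one sequence (here, the decoder) while keys and values come from another (e.g. encoder output). Below is a minimal PyTorch sketch of that pattern, not the actual GroundingDinoMultiHeadAttention code; the class name, default sizes, and the dtype cast are illustrative assumptions.

```python
from typing import Optional

import torch
import torch.nn as nn


class MultiHeadAttentionSketch(nn.Module):
    """Toy multi-head attention supporting both self- and cross-attention."""

    def __init__(self, embed_dim: int = 256, num_heads: int = 8):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)
        self.v_proj = nn.Linear(embed_dim, embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def _split_heads(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, seq, embed) -> (batch, heads, seq, head_dim)
        batch, seq, _ = x.shape
        return x.view(batch, seq, self.num_heads, self.head_dim).transpose(1, 2)

    def forward(
        self,
        hidden_states: torch.Tensor,
        key_value_states: Optional[torch.Tensor] = None,
    ) -> torch.Tensor:
        # Cross-attention when key_value_states is provided (e.g. encoder
        # output attended to by decoder queries); self-attention otherwise.
        kv = hidden_states if key_value_states is None else key_value_states
        q = self._split_heads(self.q_proj(hidden_states))
        # Cast keys/values to the query dtype so mixed-precision inputs
        # don't clash (a guess at what "Fixed dtypes" addressed).
        k = self._split_heads(self.k_proj(kv)).to(q.dtype)
        v = self._split_heads(self.v_proj(kv)).to(q.dtype)
        scores = q @ k.transpose(-2, -1) / self.head_dim**0.5
        attn = torch.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).flatten(2)  # merge heads back
        return self.out_proj(out)


# Decoder queries (length 10) attending over encoder states (length 40).
decoder_states = torch.randn(2, 10, 256)
encoder_states = torch.randn(2, 40, 256)
layer = MultiHeadAttentionSketch()
print(layer(decoder_states, key_value_states=encoder_states).shape)  # torch.Size([2, 10, 256])
```

Note that the key/value sequence length (40) differs from the query length (10): the output keeps the query sequence's shape, which is what lets decoder queries attend over arbitrarily sized encoder features.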
References
#30364 - [Grounding DINO] Add support for cross-attention in GroundingDinoMultiHeadAttention
Author
EduardoPach
Parents
408453b4