transformers
8c773dcc
- Remove cross-attention class in favor of GroupViTAttention
Commit
3 years ago
Remove cross-attention class in favor of GroupViTAttention
References
#17313 - Adding GroupViT Models
Author
Niels Rogge
Parents
0655a305
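
The commit message indicates that a dedicated cross-attention class was dropped because GroupViTAttention can cover both the self- and cross-attention paths. Below is a minimal sketch, not the actual diff, of how a single attention module can serve both roles by projecting keys and values from an optional encoder_hidden_states argument; the class name UnifiedAttention, its signature, and the shapes in the usage example are illustrative assumptions.

import torch
from torch import nn

class UnifiedAttention(nn.Module):
    """Hypothetical multi-head attention covering self- and cross-attention.

    When encoder_hidden_states is None, keys/values come from hidden_states
    (self-attention); otherwise they come from encoder_hidden_states
    (cross-attention), so no separate cross-attention class is needed.
    """

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)
        self.v_proj = nn.Linear(embed_dim, embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def _split_heads(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, seq, embed) -> (batch, heads, seq, head_dim)
        batch, seq_len, _ = x.shape
        return x.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

    def forward(self, hidden_states, encoder_hidden_states=None):
        # Keys/values are taken from the encoder states for cross-attention,
        # otherwise from the query sequence itself for self-attention.
        kv_source = encoder_hidden_states if encoder_hidden_states is not None else hidden_states

        query = self._split_heads(self.q_proj(hidden_states))
        key = self._split_heads(self.k_proj(kv_source))
        value = self._split_heads(self.v_proj(kv_source))

        attn_weights = torch.softmax(query @ key.transpose(-2, -1) * self.scale, dim=-1)
        attn_output = attn_weights @ value

        batch, _, seq_len, _ = attn_output.shape
        attn_output = attn_output.transpose(1, 2).reshape(batch, seq_len, -1)
        return self.out_proj(attn_output)


if __name__ == "__main__":
    attn = UnifiedAttention(embed_dim=64, num_heads=4)
    image_tokens = torch.randn(2, 16, 64)
    group_tokens = torch.randn(2, 8, 64)
    print(attn(image_tokens).shape)                # self-attention: (2, 16, 64)
    print(attn(group_tokens, image_tokens).shape)  # cross-attention: (2, 8, 64)

With this pattern the same projection weights and head-splitting logic are reused for both attention modes, which is the kind of consolidation the commit title describes.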