transformers
a163c9ca
- [T5] Fix Cross Attention position bias (#4499)
Committed 5 years ago
[T5] Fix Cross Attention position bias (#4499)

* fix
* fix1
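The commit title refers to the relative position bias that T5 adds to its attention logits before the softmax; self-attention layers compute this bias from relative token distances, while cross-attention layers do not compute one of their own. As context only, not a reproduction of the actual patch in this commit, below is a minimal PyTorch sketch of that general mechanism; `toy_attention_with_bias` and the tensor shapes are illustrative assumptions.

```python
# Illustrative sketch of T5-style attention with an additive position bias.
# This is NOT the code changed by commit a163c9ca; it only shows the mechanism
# named in the commit title under simplified assumptions.
import torch

def toy_attention_with_bias(q, k, v, position_bias=None):
    # q, k, v: (batch, heads, seq_len, head_dim)
    scores = torch.matmul(q, k.transpose(-1, -2))   # (batch, heads, q_len, k_len)
    if position_bias is None:
        # Cross-attention case in this sketch: no learned relative bias,
        # so a zero bias of matching shape is used.
        position_bias = torch.zeros_like(scores)
    scores = scores + position_bias                 # bias is added before softmax
    weights = torch.softmax(scores, dim=-1)
    # Return the bias as well, mirroring how it can be reused by later layers.
    return torch.matmul(weights, v), position_bias

# Tiny usage example with random tensors.
batch, heads, q_len, k_len, dim = 2, 4, 5, 7, 8
q = torch.randn(batch, heads, q_len, dim)
k = torch.randn(batch, heads, k_len, dim)
v = torch.randn(batch, heads, k_len, dim)
out, bias = toy_attention_with_bias(q, k, v)
print(out.shape, bias.shape)  # torch.Size([2, 4, 5, 8]) torch.Size([2, 4, 5, 7])
```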
References
#4499 - [T5] Fix Cross Attention position bias
Author
ZhuBaohe
Parents
1d690289