transformers
3ee431dd - [Bart/Memory] Two separate, smaller decoder attention masks (#3371)
Commit
5 years ago
[Bart/Memory] Two separate, smaller decoder attention masks (#3371)
References
#3371 - [Bart/Memory] Two separate, smaller decoder attention masks
Author
sshleifer
Parents
53fe7338
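The commit title suggests the BART decoder keeps its causal (look-ahead) mask and its padding mask as two separate, smaller tensors rather than materialising one combined per-example attention mask. A minimal PyTorch sketch of that idea, assuming this reading of the title; the function and variable names below are illustrative, not the actual transformers API:

import torch

def make_causal_mask(tgt_len: int, dtype=torch.float32) -> torch.Tensor:
    # (tgt_len, tgt_len) additive mask, shared across the whole batch:
    # -inf above the diagonal blocks attention to future positions.
    mask = torch.full((tgt_len, tgt_len), float("-inf"), dtype=dtype)
    return torch.triu(mask, diagonal=1)

def make_padding_mask(input_ids: torch.Tensor, pad_token_id: int) -> torch.Tensor:
    # (batch, tgt_len) boolean mask, True where a position is padding.
    return input_ids.eq(pad_token_id)

# Usage sketch: both masks are applied to the attention scores separately
# (add the causal mask, masked_fill with the padding mask) instead of
# expanding a single dense (batch, tgt_len, tgt_len) mask up front.
input_ids = torch.tensor([[5, 7, 9, 1, 1]])             # 1 = hypothetical pad id
causal = make_causal_mask(input_ids.size(1))            # shape (5, 5)
padding = make_padding_mask(input_ids, pad_token_id=1)  # shape (1, 5)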