flax
e0fa96fe
Commit · 1 year ago
Merge pull request #3835 from google:linen-attention-multiple-initializers

PiperOrigin-RevId: 632557380
References
#3835 - [linen] enable separate initializers for out layer in MultiHeadDotProductAttention
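The referenced change lets the output projection of flax.linen.MultiHeadDotProductAttention be initialized independently of the query/key/value projections. Below is a minimal sketch of how that might look; the parameter names out_kernel_init and out_bias_init are inferred from the PR title and should be checked against the flax.linen documentation.

# Sketch only: `out_kernel_init` / `out_bias_init` are assumed names based on
# the PR title, not confirmed from the diff itself.
import jax
import jax.numpy as jnp
import flax.linen as nn

attn = nn.MultiHeadDotProductAttention(
    num_heads=4,
    qkv_features=32,
    kernel_init=nn.initializers.xavier_uniform(),          # q/k/v projections
    bias_init=nn.initializers.zeros,
    out_kernel_init=nn.initializers.normal(stddev=0.02),   # output projection only
    out_bias_init=nn.initializers.zeros,
)

x = jnp.ones((2, 7, 32))                          # (batch, seq_len, features)
variables = attn.init(jax.random.PRNGKey(0), x)   # single input -> self-attention
y = attn.apply(variables, x)
print(y.shape)                                    # (2, 7, 32)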
Author
a-googler
Parents
ee8c55e4
43d022a8