flax
ad90642b - Remove GeGLU activation function and golden tests.

Commit · 1 year ago
Remove GeGLU activation function and golden tests. GeGLU is not a simple activation function but a gated linear layer used in modern MLPs. Our users are not well served by a baked-in implementation of a linear layer presented as a simple activation function.

PiperOrigin-RevId: 689524209
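To see why GeGLU is a layer rather than an activation, here is a minimal sketch of the standard GeGLU definition from Shazeer (2020), GeGLU(x) = GELU(xW + b) ⊗ (xV + c). The weight matrices `W`, `V` and biases `b`, `c` are learned parameters, which a pure activation function would not have; all names and shapes below are illustrative, not the removed Flax code.

```python
import jax
import jax.numpy as jnp

def geglu(x, W, V, b, c):
    # GeGLU(x) = GELU(xW + b) * (xV + c).
    # The two learned projections W and V are what make this a
    # gated linear *layer*, not a parameter-free activation.
    return jax.nn.gelu(x @ W + b) * (x @ V + c)

# Hypothetical shapes for illustration.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (4, 8))         # batch of 4, 8 input features
W = jax.random.normal(key, (8, 16)) * 0.1  # gate projection
V = jax.random.normal(key, (8, 16)) * 0.1  # value projection
b = jnp.zeros(16)
c = jnp.zeros(16)

out = geglu(x, W, V, b, c)
print(out.shape)  # (4, 16)
```

Because the parameters live inside the op, users building MLPs are better served by composing two `Dense` projections and a plain `gelu` themselves than by calling an opaque "activation".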