Remove GeGLU activation function and golden tests.
GeGLU is not a simple activation function but a gated linear layer used in modern MLPs:
it carries learned projection weights, unlike a pointwise activation. Our users are not
well served by a baked-in implementation of a linear layer presented as a simple
activation function.
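
For context, a minimal sketch of GeGLU in JAX shows why it is a layer rather than a
pointwise activation (the parameter names W and V are illustrative, not taken from the
removed implementation):

```python
import jax

def geglu(x, W, V):
    # GeGLU gates one learned projection of x with a GELU of another:
    #   GeGLU(x) = GELU(x @ W) * (x @ V)
    # The learned matrices W and V are what make this a linear layer
    # rather than a pointwise activation like relu or gelu.
    return jax.nn.gelu(x @ W) * (x @ V)

# Toy usage: project an 8-dim input to 16 dims with gating.
key = jax.random.PRNGKey(0)
kw, kv, kx = jax.random.split(key, 3)
x = jax.random.normal(kx, (4, 8))
W = jax.random.normal(kw, (8, 16))
V = jax.random.normal(kv, (8, 16))
y = geglu(x, W, V)  # shape (4, 16)
```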
PiperOrigin-RevId: 689524209