Back out "add mixed data type mode for LayerNorm forward path"
Pull Request resolved: https://github.com/pytorch/pytorch/pull/78298
Also back out "improve LayerNorm bfloat16 performance on CPU".
These layer norm changes seem fine on their own, but they prevent `LayerNorm` from using AVX2 instructions, which degrades performance on internal models. More investigation is needed to find the true root cause, but we should unland to mitigate the issue ASAP.
I left `mixed_data_type.h` in place since some other files still depend on it.
Differential Revision: [D36675352](https://our.internmc.facebook.com/intern/diff/D36675352/)
Approved by: https://github.com/tenpercent