DOC Improve documentation for LayerNorm (#63144)
Summary:
In this [commit](https://github.com/pytorch/pytorch/pull/59178/commits/7026995f3ca253fbc19bf511d53f48f861799a4a) and [issue](https://github.com/pytorch/pytorch/pull/59178#issuecomment-897485295), [Line 134](https://github.com/deniskokarev/pytorch/blob/47e286d024c183cb26a464447b34fde88b80d17d/torch/nn/modules/normalization.py#L134) of the docstring example overwrites the `embedding` variable, which causes an error when instantiating `nn.LayerNorm`, as sketched below.
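For reference, the problematic pattern is roughly the following sketch (reconstructed from the description above, so it may not match the docstring verbatim): `embedding` is first bound to the dimension size and then rebound to the sampled tensor, so `nn.LayerNorm` ends up receiving a tensor instead of a shape.
```
import torch
import torch.nn as nn

batch, sentence_length, embedding = 20, 5, 10
# Rebinds `embedding` from the int 10 to a tensor of shape (20, 5, 10)
embedding = torch.randn(batch, sentence_length, embedding)
# `normalized_shape` now receives a tensor rather than an int or shape, which errors out
layer_norm = nn.LayerNorm(embedding)
```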
I suggest renaming `embedding` on [Line 133](https://github.com/deniskokarev/pytorch/blob/47e286d024c183cb26a464447b34fde88b80d17d/torch/nn/modules/normalization.py#L133) to `embedding_dim`.
The final example is:
```
import torch
import torch.nn as nn

# `embedding_dim` names the last-dimension size, so it is no longer shadowed by the tensor
batch, sentence_length, embedding_dim = 20, 5, 10
embedding = torch.randn(batch, sentence_length, embedding_dim)
layer_norm = nn.LayerNorm(embedding_dim)
```
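Continuing the snippet above, a quick sanity check is to apply the module to the tensor; with a single int as `normalized_shape`, LayerNorm normalizes over the last dimension and the output keeps the input's shape.
```
output = layer_norm(embedding)  # normalizes over the last dimension (size embedding_dim)
print(output.shape)             # torch.Size([20, 5, 10])
```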
Fixes #59178
Pull Request resolved: https://github.com/pytorch/pytorch/pull/63144
Reviewed By: bdhirsh
Differential Revision: D30288778
Pulled By: jbschlosser
fbshipit-source-id: e74b11430e302dae5661bf6e830ee5ac6c1838c4