Update docstrings of torch.nn.modules.activation.MultiheadAttention (#48775)
Summary:
- Add a link to the original paper (Attention Is All You Need)
- Fix indentation
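
For context, a minimal usage sketch of the module whose docstrings this PR updates (shapes follow the default `(seq_len, batch, embed_dim)` layout; the sizes here are illustrative, not from the PR):

```python
import torch
from torch import nn

# Self-attention with 4 heads over 16-dimensional embeddings.
mha = nn.MultiheadAttention(embed_dim=16, num_heads=4)

# Default input layout: (seq_len, batch, embed_dim).
query = torch.rand(5, 2, 16)

# Self-attention: query, key, and value are the same tensor.
attn_output, attn_weights = mha(query, query, query)

print(attn_output.shape)   # torch.Size([5, 2, 16])
print(attn_weights.shape)  # torch.Size([2, 5, 5]) -- averaged over heads
```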
Pull Request resolved: https://github.com/pytorch/pytorch/pull/48775
Reviewed By: H-Huang
Differential Revision: D25465914
Pulled By: heitorschueroff
fbshipit-source-id: bbc296ec1523326e323587023c126e820e90ad8d