Add `multi_head_attention_forward` to functional rst docs (#72675)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/72597
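
For reference, a minimal usage sketch of the function now listed in the functional docs, `torch.nn.functional.multi_head_attention_forward`. The tensor shapes and randomly initialized projection weights below are illustrative only and are not taken from this PR:

```python
import torch
import torch.nn.functional as F

embed_dim, num_heads = 8, 2
tgt_len, src_len, batch = 4, 5, 3

# Inputs are sequence-first: (L, N, E) for query, (S, N, E) for key/value.
query = torch.randn(tgt_len, batch, embed_dim)
key = torch.randn(src_len, batch, embed_dim)
value = torch.randn(src_len, batch, embed_dim)

# Packed q/k/v input projection and the output projection (illustrative values).
in_proj_weight = torch.randn(3 * embed_dim, embed_dim)
in_proj_bias = torch.zeros(3 * embed_dim)
out_proj_weight = torch.randn(embed_dim, embed_dim)
out_proj_bias = torch.zeros(embed_dim)

attn_output, attn_weights = F.multi_head_attention_forward(
    query, key, value, embed_dim, num_heads,
    in_proj_weight, in_proj_bias,
    bias_k=None, bias_v=None, add_zero_attn=False,
    dropout_p=0.0,
    out_proj_weight=out_proj_weight, out_proj_bias=out_proj_bias,
)
print(attn_output.shape)   # torch.Size([4, 3, 8])  -> (L, N, E)
print(attn_weights.shape)  # torch.Size([3, 4, 5])  -> (N, L, S), averaged over heads
```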
Pull Request resolved: https://github.com/pytorch/pytorch/pull/72675
Reviewed By: malfet
Differential Revision: D34154832
Pulled By: jbschlosser
fbshipit-source-id: 7279d05f31d41259e57ba28fe6fdb7079d603660
(cherry picked from commit 68c32cdbd7f7f709888413d649b3f27e48b411de)