Remove Redundant Bullet Point (#120007)
The fast path explanation for scaled_dot_product_attention in nn.MultiheadAttention mentioned the requirement that inputs be batched with batch_first=True twice. Removed the second mention of this requirement.
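For context, a minimal sketch of the documented requirement: the fused fast path (which dispatches to scaled_dot_product_attention) applies only under certain conditions, including batched input with batch_first=True and inference mode. The shapes and hyperparameters below are illustrative, not from the PR.

```python
import torch
import torch.nn as nn

# batch_first=True is one precondition for the fused fast path.
mha = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
mha.eval()  # the fast path only applies in inference mode

# With batch_first=True, inputs are (batch, seq, embed).
x = torch.randn(2, 5, 16)
with torch.no_grad():
    out, _ = mha(x, x, x, need_weights=False)

print(out.shape)  # torch.Size([2, 5, 16])
```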
Pull Request resolved: https://github.com/pytorch/pytorch/pull/120007
Approved by: https://github.com/mikaylagawarecki