[fix] torch.nn.functional.embedding -> padding_idx behavior (#46714)
Summary:
Reference https://github.com/pytorch/pytorch/issues/46585
Fix for the second snippet in the mentioned issue, where `padding_idx` was not handled correctly by the functional form of embedding.
```python
import torch

predefined_weights = torch.rand(10, 3)
result = torch.nn.functional.embedding(
    torch.LongTensor([1, 2, 0]), predefined_weights, padding_idx=0
)
```
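For context, a minimal sketch of the behavior `padding_idx` is documented to provide (variable names here are illustrative, not from the patch): the weight row at `padding_idx` should receive no gradient during backward, while rows that were actually looked up do.

```python
import torch

weights = torch.rand(10, 3, requires_grad=True)
out = torch.nn.functional.embedding(
    torch.LongTensor([1, 2, 0]), weights, padding_idx=0
)
out.sum().backward()
# With padding_idx=0, the gradient row for index 0 is expected to remain
# zero, while rows 1 and 2 (which were looked up) accumulate gradient.
print(weights.grad[0])
```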
Pull Request resolved: https://github.com/pytorch/pytorch/pull/46714
Reviewed By: VitalyFedyunin
Differential Revision: D24593352
Pulled By: albanD
fbshipit-source-id: 655b69d9ec57891871e26feeda2aa0dcff73beba