fix cdist gradient computation if first arg is 1xn (#26254)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/26076. mruberry: if https://github.com/pytorch/pytorch/issues/26248 lands soon, I'll rebase on top of it; otherwise this should go in now, since it's a bug fix.
Side note: cdist backward testing is very light, and I suspect it does not exercise all the code paths, but that's a separate issue.
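The shape that triggered the bug can be sketched roughly as follows. This is a hypothetical reproduction, not the PR's actual test: it calls cdist with a 1 x n first argument and checks that backward produces gradients of the right shapes, with gradcheck comparing analytical against numerical gradients.

```python
# Hedged sketch: exercise cdist backward when the first argument is 1 x n,
# the case this PR fixes. Shapes and values here are illustrative.
import torch

x = torch.randn(1, 4, dtype=torch.double, requires_grad=True)  # 1 x n first arg
y = torch.randn(5, 4, dtype=torch.double, requires_grad=True)

d = torch.cdist(x, y)  # pairwise distances, shape (1, 5)
d.sum().backward()     # runs the gradient path affected by the bug

assert x.grad.shape == x.shape
assert y.grad.shape == y.shape

# gradcheck verifies analytical gradients against finite differences
assert torch.autograd.gradcheck(
    torch.cdist,
    (x.detach().requires_grad_(), y.detach().requires_grad_()),
)
```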
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26254
Test Plan: added a test for the affected size to test_autograd.py. Streams are covered by existing tests.
Differential Revision: D17480945
Pulled By: ngimel
fbshipit-source-id: 0f18c9fd05e462d22c410a2ebddc2bcc9580582d
Author: Natalia Gimelshein