Fix memory leak in torch._dirichlet_grad() (#20244)
Summary:
Fixes https://github.com/pyro-ppl/pyro/issues/1853
This fixes a memory leak in `torch._dirichlet_grad()`. This function computes the reparameterized gradients used by the `Dirichlet` and `Beta` distributions.
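For context, `torch._dirichlet_grad()` is an internal function; the public code path that exercises it is `Dirichlet.rsample()`, whose reparameterized gradient flows back to the concentration parameter. A minimal sketch of the pattern that was leaking (the tensor values and loss here are illustrative, not from the original report):

```python
import torch

# A differentiable (reparameterized) sample from a Dirichlet distribution.
# Backpropagating through it invokes torch._dirichlet_grad() internally.
concentration = torch.tensor([0.5, 1.0, 2.0], requires_grad=True)
dist = torch.distributions.Dirichlet(concentration)

sample = dist.rsample()      # point on the probability simplex
loss = sample.pow(2).sum()   # any scalar function of the sample
loss.backward()              # gradient w.r.t. concentration via _dirichlet_grad

assert concentration.grad is not None
assert concentration.grad.shape == concentration.shape
```

Running a loop of such `rsample()`/`backward()` calls is what surfaced the steadily growing memory in the linked Pyro issue.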
- [x] Could a reviewer please confirm that `freeCopyTo()` is used correctly and doesn't need an additional `decref()`? I'm unfamiliar with PyTorch's C++ memory utilities, so help is appreciated.
- [x] Ran locally and confirmed the leak is fixed.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/20244
Differential Revision: D15259008
Pulled By: ezyang
fbshipit-source-id: 222ec7d80ddd97bcdd7d54549f3e756575e8402e