Hook up non_differentiability in derivatives.yaml when no autograd function is generated. (#19520)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/19520
ghimport-source-id: a1272aa0b23692fb189974c4daba7b2e4e0dad50
Differential Revision: D15021380
Pulled By: gchanan
fbshipit-source-id: ec83efd4bb6d17714c060f13a0527a33a10452db
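
For context, derivatives.yaml is the PyTorch autograd codegen config. A hypothetical entry illustrating the kind of non-differentiability declaration this change wires up might look like the following; the function name and exact keys are illustrative assumptions, not taken from this PR:

```yaml
# Hypothetical derivatives.yaml entry (illustrative only).
# Declaring an output as non-differentiable tells the autograd
# code generator not to emit a backward function for it.
- name: nonzero(Tensor self) -> Tensor
  output_differentiability: [False]
```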