53c0d91d - Make autograd codegen for differentiable outputs safer to use (#65823)

Summary:
This PR raises an error when `len(output_differentiability) != len(outputs)`.

The notes in `derivatives.yaml` state that

> an entry with key 'output_differentiability' and value a list of the same length as the number of outputs from the forward function.

but this was not enforced in codegen, leading to confusion and unexpected bugs (https://github.com/pytorch/pytorch/issues/65061#issuecomment-930271126).

cc ezyang albanD zou3519 gqchen pearu nikitaved soulitzer Lezcano Varal7

Pull Request resolved: https://github.com/pytorch/pytorch/pull/65823
Reviewed By: mrshenli
Differential Revision: D31307312
Pulled By: albanD
fbshipit-source-id: caeb949e9249310dffd237e77871e6d0d784e298
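For context, here is a minimal sketch of the kind of length check this commit introduces. The actual check lives in the autograd codegen; the function name, entry shape, and error message below are illustrative, not the real pytorch code:

```python
# Hypothetical sketch of the validation this commit adds: if a
# derivatives.yaml entry carries an 'output_differentiability' list,
# its length must match the number of forward-function outputs.
from typing import Any, Dict, List


def check_output_differentiability(entry: Dict[str, Any], outputs: List[str]) -> None:
    """Raise if 'output_differentiability' does not cover every output.

    `entry` stands for one parsed derivatives.yaml record and `outputs`
    for the forward function's output names; both names are illustrative.
    """
    diff = entry.get("output_differentiability")
    if diff is None:
        # Entry omitted: previously (and still) all outputs are treated
        # as differentiable, so there is nothing to validate.
        return
    if len(diff) != len(outputs):
        raise RuntimeError(
            f"output_differentiability must have the same length as the "
            f"number of outputs ({len(outputs)}), got {len(diff)}"
        )


# Example: a mismatched list now fails loudly at codegen time instead
# of silently miscounting differentiable outputs.
try:
    check_output_differentiability(
        {
            "name": "foo(Tensor self) -> (Tensor, Tensor)",
            "output_differentiability": [True],  # only 1 entry for 2 outputs
        },
        outputs=["output0", "output1"],
    )
except RuntimeError as e:
    print(e)
```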