pytorch
3de2c0b4 - Fix L1Loss when target.requires_grad is True. (#44471)

Fix L1Loss when target.requires_grad is True. (#44471)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/44471

L1Loss had a completely separate (and incorrect, see #43228) code path when target.requires_grad was True. This PR does the following:
1) adds derivative support for target via the normal derivatives.yaml route
2) kills the separate (and incorrect) path for when target.requires_grad was True
3) modifies the L1Loss CriterionTests to verify that the target derivative is checked

Test Plan: Imported from OSS

Reviewed By: albanD

Differential Revision: D23626008

Pulled By: gchanan

fbshipit-source-id: 2828be16b56b8dabe114962223d71b0e9a85f0f5
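For context on what "derivative support for target" means here: mean-reduced L1 loss is L = (1/N) * sum_i |input_i - target_i|, so the gradient w.r.t. target is simply the negation of the gradient w.r.t. input, -sign(input_i - target_i)/N. The sketch below is plain Python (not PyTorch internals, and not the actual derivatives.yaml entry) illustrating the two analytical gradients and checking the target gradient against a central finite difference, in the spirit of a gradcheck.

```python
# Illustrative sketch (plain Python, not PyTorch): mean-reduced L1 loss
# and its analytical gradients w.r.t. both input and target.

def l1_loss(inp, tgt):
    """Mean absolute error: L = (1/N) * sum_i |inp_i - tgt_i|."""
    n = len(inp)
    return sum(abs(a - b) for a, b in zip(inp, tgt)) / n

def sign(x):
    # Returns -1, 0, or 1 as an int.
    return (x > 0) - (x < 0)

def l1_loss_grads(inp, tgt):
    """Analytical gradients:
       dL/dinp_i =  sign(inp_i - tgt_i) / N
       dL/dtgt_i = -sign(inp_i - tgt_i) / N   (the derivative this PR adds)
    """
    n = len(inp)
    g_inp = [sign(a - b) / n for a, b in zip(inp, tgt)]
    g_tgt = [-g for g in g_inp]
    return g_inp, g_tgt

def numeric_grad(f, xs, i, eps=1e-6):
    """Central finite difference in coordinate i (a tiny gradcheck)."""
    hi = list(xs); hi[i] += eps
    lo = list(xs); lo[i] -= eps
    return (f(hi) - f(lo)) / (2 * eps)

if __name__ == "__main__":
    # Avoid points where inp_i == tgt_i: |x| is not differentiable at 0.
    inp = [0.5, -1.2, 3.0, -0.4]
    tgt = [0.1, 0.4, 2.5, -0.1]
    g_inp, g_tgt = l1_loss_grads(inp, tgt)
    for i in range(len(tgt)):
        num = numeric_grad(lambda t: l1_loss(inp, t), tgt, i)
        assert abs(num - g_tgt[i]) < 1e-4
    print("target gradient matches finite differences")
```

The pre-fix special path mishandled exactly this target gradient (see #43228); routing it through derivatives.yaml lets autograd produce it the same way as every other differentiable argument.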