91611fe1 - Decouple forward AD checks from backward AD in OpInfo tests and gradcheck (#65040)

Summary:
Fixes https://github.com/pytorch/pytorch/issues/64999

- Adds a flag `check_backward_ad` to gradcheck that can be used to disable the gradcheck for backward AD
  - This is a bit BC-breaking in terms of positional args, but I prefer this ordering
- In OpInfo tests for forward AD:
  - set `check_backward_ad` to False
- In test_ops, treat `supports_autograd` as if it were `supports_backward_ad` (it basically already is)
  - the only modification needed is to no longer skip forward AD tests if `supports_autograd` is False
  - test_dtype, test_variant_consistency, etc. behave correctly as-is
  - In a follow-up PR, we can rename it to actually be `supports_backward_ad`
- Testing: https://github.com/pytorch/pytorch/pull/65060

Pull Request resolved: https://github.com/pytorch/pytorch/pull/65040
Reviewed By: albanD
Differential Revision: D31238177
Pulled By: soulitzer
fbshipit-source-id: f068d4cbe7ffb094930b16cddb210583b9b7b2c4
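As a rough sketch of what the new flag enables, the snippet below runs `torch.autograd.gradcheck` with the forward-mode check on and the backward-mode check off. This assumes a PyTorch build recent enough to include the `check_backward_ad` keyword added by this PR; the function `fn` is just an illustrative op.

```python
import torch
from torch.autograd import gradcheck

# An arbitrary differentiable function used for illustration.
def fn(x):
    return x.sin() * x

# gradcheck needs double precision inputs with requires_grad=True.
x = torch.randn(3, dtype=torch.double, requires_grad=True)

# Check forward-mode AD only; skip the backward (reverse-mode) AD checks.
# Before this PR, forward AD could not be checked independently of backward AD.
ok = gradcheck(fn, (x,), check_forward_ad=True, check_backward_ad=False)
```

On success `gradcheck` returns `True` (by default it raises on failure), so `ok` can be asserted directly in a test.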