Improve AOTAutograd tests to do something when inputs don't require grad (#106558)
This PR:
- Changes the AOTAutograd tests to also check that the output of the
forward is equal under AOTAutograd and eager-mode PyTorch.
- Adds a "check_gradients" flag to `check_aot_autograd`.
- If True, then we attempt to compute gradients and check them.
  - If False, then we just check that the outputs are equal.
- If "auto", then we will compute gradients and check them only if
some input and some output requires grad. This option is useful for
crossref tests where we don't necessarily have inputs that require
grad.
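The "auto" decision described above can be sketched as a small helper (hypothetical; the name `should_check_gradients` and its signature are illustrative, not the actual PyTorch internals):

```python
def should_check_gradients(check_gradients, inputs, outputs):
    """Decide whether to compute and compare gradients.

    check_gradients: True, False, or "auto".
    inputs/outputs: sequences of tensor-like objects exposing a
    .requires_grad attribute (e.g. torch.Tensor).
    """
    if check_gradients is True:
        return True
    if check_gradients is False:
        return False
    if check_gradients == "auto":
        # Gradients are only meaningful when differentiation is possible:
        # at least one input AND at least one output must require grad.
        any_input = any(getattr(t, "requires_grad", False) for t in inputs)
        any_output = any(getattr(t, "requires_grad", False) for t in outputs)
        return any_input and any_output
    raise ValueError(f"invalid check_gradients value: {check_gradients!r}")
```

With "auto", crossref tests whose inputs happen not to require grad silently fall back to an output-only equality check instead of erroring.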
The "auto" option exists for two reasons:
1) I need a testing utility to test "AOTAutograd for inference",
   e.g. make_fx + functionalize.
2) I want to run aot_autograd_check in crossref tests for other test
   suites (e.g. fbgemm), where not all inputs require grad.
Test Plan:
- existing tests
- new tests to test the degenerate cases
Pull Request resolved: https://github.com/pytorch/pytorch/pull/106558
Approved by: https://github.com/ezyang, https://github.com/soulitzer