Retry - [JIT] Propagate profiled information to DifferentiableGraph outputs
Without profiled outputs, autodiff can't tell whether the outputs of a DifferentiableGraph should require grad. With no profiled information, autodiff defaulted to requires_grad=True, marking tensors as requiring grad when they shouldn't. This change annotates the output's type with requires_grad information when it can be found in later uses of the output.
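A minimal sketch of the kind of user-visible behavior this targets (the function body, warm-up count, and variable names are illustrative, not taken from the PR's tests):

```python
import torch

@torch.jit.script
def fn(a: torch.Tensor, b: torch.Tensor):
    # 'a' requires grad, 'b' does not; once the profiling executor forms
    # autodiff subgraphs, both results are outputs of a DifferentiableGraph.
    return a * 2.0, b * 2.0

a = torch.randn(4, requires_grad=True)
b = torch.randn(4)  # requires_grad=False

# Warm up so profiled type information is recorded and subgraphs are created.
for _ in range(3):
    out_a, out_b = fn(a, b)

# With profiled requires_grad propagated to the subgraph outputs, out_b should
# match eager mode (requires_grad=False) instead of being conservatively
# marked as requiring grad.
assert out_a.requires_grad
assert not out_b.requires_grad
```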
Adds a test for correct autodiff requires_grad behavior, as well as a test that the output type is correctly annotated in create_autodiff_subgraphs.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/79498
Approved by: https://github.com/eellison