Call jit decomposition in VariableType to increase forward AD coverage (#84151)
This PR:
- updates forward AD codegen in core to generate code that tries calling into decompositions registered to the jit when:
- (1) the function is not an in-place or out= variant
- AND (2) the function is differentiable (requires_derivative=True)
- AND (3) there is no forward AD formula registered
- To simplify things, we always generate the if/else (as long as (1) holds), but hardcode the condition to 'false' when either (2) or (3) is false.
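The branching described above can be sketched as follows. This is a hypothetical illustration, not the actual PyTorch codegen: the function name `emit_forward_ad_dispatch` and the emitted pseudo-C++ strings are invented for clarity; only the decision structure (conditions (1)-(3) and the hardcoded `false`) comes from this PR.

```python
# Hypothetical sketch of the decision logic described above (NOT the real
# PyTorch autograd codegen). It mirrors how the generated VariableType code
# would be shaped: the jit-decomposition fallback branch is emitted whenever
# the op is not an in-place/out variant, but its condition is hardcoded to
# "false" when the op is non-differentiable or already has a forward formula.

def emit_forward_ad_dispatch(is_inplace_or_out: bool,
                             requires_derivative: bool,
                             has_forward_formula: bool) -> str:
    """Return a pseudo-C++ snippet mirroring the generated structure."""
    if is_inplace_or_out:
        # Condition (1) fails: no fallback branch is generated at all.
        return "// no jit-decomposition fallback emitted"
    # Conditions (2) and (3): only fall back when the op is differentiable
    # and no forward AD formula exists.
    try_decomp = requires_derivative and not has_forward_formula
    cond = "any_has_forward_grad" if try_decomp else "false"
    return (f"if ({cond}) {{\n"
            "  return run_jit_decomposition(...);\n"
            "} else {\n"
            "  // existing dispatch path\n"
            "}")
```

For example, an op with a registered forward formula still gets the if/else, but with the condition compiled down to `false`, so the decomposition path is dead code there.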
- removes the mechanism from functorch
- (follow-up) some functorch tests should be updated here so they no longer need to compute the Jacobian with vjp
- factors out some logic to generate the any_has_forward_grad condition
- (bc-breaking) when TensorList inputs unexpectedly have forward grad, the error message will no longer contain the name
See https://github.com/pytorch/pytorch/pull/84151#issuecomment-1238519247 for codegen output and more discussion.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/84151
Approved by: https://github.com/samdow, https://github.com/albanD, https://github.com/zou3519