Do not generate not_implemented error for forward AD when input with tangent passed to non-differentiable function (#66926)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/61926
1. Update the `if` condition to just use `requires_derivative`, since that should reflect whether the function is differentiable.
2. If `requires_derivative=True` but no outputs have forward derivatives, we should error as usual.
3. When `len(fw_derivatives) > 0 and len(fw_derivatives) < num_diff_outputs`, we should add an assert in codegen that this does not happen.
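The intended user-facing behavior can be sketched as follows (a hypothetical illustration, not code from this PR; assumes a PyTorch build with `torch.autograd.forward_ad` available and with this fix applied):

```python
import torch
import torch.autograd.forward_ad as fwAD

primal = torch.randn(3)
tangent = torch.randn(3)

with fwAD.dual_level():
    # Attach a tangent to the input, making it a dual tensor.
    dual = fwAD.make_dual(primal, tangent)
    # `torch.argmax` is not differentiable. With this change, passing a dual
    # tensor to it should no longer raise a not_implemented forward-AD error;
    # the output simply carries no tangent.
    out = torch.argmax(dual)
    p, t = fwAD.unpack_dual(out)
    assert t is None  # non-differentiable output has no forward gradient
```

Differentiable functions that merely lack a forward derivative entry still error as before (point 2 above).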
Pull Request resolved: https://github.com/pytorch/pytorch/pull/66926
Reviewed By: anjali411
Differential Revision: D31810736
Pulled By: soulitzer
fbshipit-source-id: 11a14477cc7554f576cff2ed1711a448a8c6a66a