[functorch] Disable calling Tensor.requires_grad_() inside a functorch transform (pytorch/functorch#849)
Fixes pytorch/functorch#847
We no longer allow users to call requires_grad_() inside a functorch
transform. By calling requires_grad_(), the user is effectively asking for
another layer of autograd, but that doesn't actually work: setting up a
layer of autograd requires extra bookkeeping (e.g. pushing autograd onto
the DynamicLayerStack). Instead, when a user calls requires_grad_() (and
similarly retain_grad()) inside a transform, we raise an informative error
message (see the sketch below).
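A minimal sketch of the new behavior and the functorch-style workaround. The helper names (f_bad, f_good) and the exact failure mode are illustrative, not taken from the PR:

```python
import torch
from functorch import grad

def f_bad(x):
    y = torch.ones([])
    y.requires_grad_()   # with this change: errors when run inside a functorch transform
    return x * y

# grad(f_bad)(torch.randn([]))  # would now raise instead of silently misbehaving

# The functorch way: make y an explicit input and differentiate w.r.t. it via argnums.
def f_good(x, y):
    return x * y

dy = grad(f_good, argnums=1)(torch.randn([]), torch.ones([]))
```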
This has the intended consequence of causing
torch.autograd.functional.{jvp, vjp, jacobian} to error out when called
inside a functorch transform. Users should use the functorch equivalents
instead (see the sketch below).
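For reference, a minimal sketch of the functorch equivalents; the function f and the inputs are made up for illustration:

```python
import torch
from functorch import jvp, vjp, jacrev

def f(x):
    return x.sin()

x = torch.randn(3)
v = torch.randn(3)

_, jvp_out = jvp(f, (x,), (v,))   # instead of torch.autograd.functional.jvp
out, vjp_fn = vjp(f, x)           # instead of torch.autograd.functional.vjp
(vjp_out,) = vjp_fn(v)
jac = jacrev(f)(x)                # instead of torch.autograd.functional.jacobian
```

Unlike the torch.autograd.functional versions, these compose with other functorch transforms (e.g. vmap).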
Test Plan:
- added tests