Fix inplace check logic to be triggered when the written-to Tensor does not require gradients (#46296)
Summary:
Fix https://github.com/pytorch/pytorch/issues/46242
This ensures that `check_inplace()` runs the proper checks even if the Tensor being modified in place does not require gradients, since the Tensor written into it might require gradients and would make this in-place modification differentiable.
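For context, a minimal sketch (not taken from the PR) of the situation this check covers: the destination does not require gradients, but the source does, so the in-place write must still go through the differentiability checks.

```python
import torch

dest = torch.zeros(3)                     # does not require gradients
src = torch.ones(3, requires_grad=True)   # requires gradients

# Writing `src` into `dest` in place makes the result depend on `src`,
# so the operation is differentiable even though `dest` itself did not
# require gradients before the write.
dest.copy_(src)
print(dest.requires_grad)  # True: gradients can flow back to `src`
print(dest.grad_fn)        # a backward node (e.g. CopyBackwards)
```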
This contains:
- Codegen changes to tell `check_inplace()` whether the in-place operation will be differentiable
- Changes in `handle_view_on_rebase` to work properly even when called for an input that does not require gradients (previously it was assumed to require them)
- Corresponding tests (without this fix, both the warning and the error cases raise internal assert errors); see the sketch after this list
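As an illustration of the error case (this snippet is an assumption about what the tests exercise, not copied from the PR): a view produced by an op that returns multiple views is written to in place with a Tensor that requires gradients, while the base does not.

```python
import torch

base = torch.zeros(3)            # base does not require gradients
view = base.unbind()[0]          # view from an op returning multiple views
src = torch.ones((), requires_grad=True)

# Without this fix, the checks were skipped because `view` does not require
# gradients, and the write later tripped an internal assert. With the fix,
# the regular user-facing error about modifying the output of a function
# that returns multiple views in place should be raised instead.
view.copy_(src)
```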
Pull Request resolved: https://github.com/pytorch/pytorch/pull/46296
Reviewed By: ezyang
Differential Revision: D24903770
Pulled By: albanD
fbshipit-source-id: 74e65dad3d2e3b9f762cbb7b39f92f19d9a0b094