[SR] Drop support for aten::__is__ and aten::__isnot__ (#67550)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/67550
`aten::__is__` and `aten::__isnot__` are extremely problematic for a large number of SR graph optimizations.
Some examples:
- Removing ops like `aten::detach` that are no-ops in the forward pass. This would normally be trivial, but `is` introduces corner cases like this:
```
def forward(x):
    y = x.detach()
    return x is y
```
We get `False` before optimizations, but after optimizations the test becomes `x is x` and we get `True` (see the eager-mode sketch after this list).
- `ReplaceWithCopy`: the pass that replaces ops like `aten::to` with an out variant that copies its input. The following graph returns `True` before optimizations but `False` afterwards:
```
def forward(x):
    y = x.to(x.dtype)
    return x is y
```
- And many more: `FuseListUnpack` can break in similar ways.
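For reference, eager-mode PyTorch exhibits the same identity semantics that these passes would silently change; a minimal runnable sketch:
```
import torch

x = torch.randn(3)

# `detach` returns a new Tensor object (sharing storage), so identity fails.
# Removing the `detach` turns the comparison into `x is x`, flipping it to True.
print(x.detach() is x)  # False

# `to` with a matching dtype is a no-op that returns `self`, so identity holds.
# Replacing it with a copying out variant flips this to False.
print(x.to(x.dtype) is x)  # True
```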
Since 99.99% of users never use these ops, rejecting them outright so we don't have to reason about these corner cases in every optimization pass is not a big deal.
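A minimal sketch of what such a rejection could look like, using the TorchScript Python graph API (`graph.nodes()`, `node.kind()`). This is illustrative only; the actual Static Runtime check lives in C++, and the helper name below is hypothetical:
```
import torch

# Ops whose presence makes identity-sensitive optimizations unsafe.
BANNED_OPS = {"aten::__is__", "aten::__isnot__"}

def reject_identity_ops(graph):
    # Scan the top-level nodes of a TorchScript graph and reject the banned ops.
    # (A real check would also recurse into sub-blocks.)
    for node in graph.nodes():
        if node.kind() in BANNED_OPS:
            raise RuntimeError(f"op not supported by Static Runtime: {node.kind()}")

@torch.jit.script
def forward(x: torch.Tensor):
    y = x.detach()
    return x is y

reject_identity_ops(forward.graph)  # raises: aten::__is__ is not supported
```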
Test Plan: `buck test caffe2/benchmarks/static_runtime:static_runtime_cpptest`
Reviewed By: d1jang
Differential Revision: D32022584
fbshipit-source-id: d135938edb2299c9b8f9511afac2bf568578879e