Refactored ops on size to be dispatcher ops (#83719)
An example of what the graph looks like after this change:
```
def forward(self, x_1):
size = torch.ops.math.size(x_1, 0)
size_1 = torch.ops.math.size(x_1, 1); x_1 = None
ones = torch.ops.aten.ones.default([1], device = device(type='cpu'), pin_memory = False)
expand_sym_int = torch.ops.aten.expand.SymInt(ones, [size, size_1]); ones = size = size_1 = None
cos_default = torch.ops.aten.cos.default(expand_sym_int); expand_sym_int = None
return (cos_default,)
```
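For reference, the traced graph above computes a tensor of ones expanded to the input's (symbolic) shape, then takes its cosine. A minimal plain-Python sketch of those semantics (illustrative only; the real graph dispatches through `torch.ops`, and `forward_semantics` is a hypothetical name, not part of the PR):

```python
import math

def forward_semantics(x):
    # Mirrors the traced graph step by step, using nested lists in place
    # of tensors.
    size0 = len(x)      # torch.ops.math.size(x_1, 0)
    size1 = len(x[0])   # torch.ops.math.size(x_1, 1)
    ones = [1.0]        # aten.ones.default([1])
    # aten.expand.SymInt(ones, [size, size_1]): broadcast the single
    # element to a size0 x size1 grid.
    expanded = [[ones[0]] * size1 for _ in range(size0)]
    # aten.cos.default: elementwise cosine.
    return [[math.cos(v) for v in row] for row in expanded]

result = forward_semantics([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
```

The point of the PR is that the two `size` calls are now first-class dispatcher ops in the graph rather than being burned in as constants.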
Pull Request resolved: https://github.com/pytorch/pytorch/pull/83719
Approved by: https://github.com/ezyang