Remove unnecessary whitespace in complex tensors (#36331)
Summary:
This PR addresses Issue https://github.com/pytorch/pytorch/issues/36279.
Previously, printing complex tensors would sometimes yield extra whitespace before the elements, as shown below:
```
print(torch.tensor([[1 + 1.340j, 3 + 4j], [1.2 + 1.340j, 6.5 + 7j]], dtype=torch.complex64))
```
would yield
```
tensor([[(1.0000 + 1.3400j),
         (3.0000 + 4.0000j)],
        [(1.2000 + 1.3400j),
         (6.5000 + 7.0000j)]], dtype=torch.complex64)
```
This occurs because the formatter computes `self.max_width` before the float values are truncated for display. As a result, `self.max_width` ends up much larger than the final length of the element string that is actually printed.
I address this by adding a boolean variable that records whether a complex tensor contains only integer values, and by changing the control flow for calculating `self.max_width` accordingly, so the width is derived from the strings that are actually printed.
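To make the control-flow change concrete, here is a minimal, hypothetical sketch of the idea (the names `complex_widths`, `all_int`, and `strings` are illustrative only; the actual change lives in `_Formatter` in `torch/_tensor_str.py` and handles more cases, such as negative signs and scientific notation):
```python
import torch

def complex_widths(tensor, precision=4):
    # Sketch: format every element first, then measure the max width from
    # the *final* strings, so padding matches what is actually printed.
    values = tensor.flatten().tolist()
    # The added boolean: True only if every real and imaginary part is
    # integral, in which case the short integer form "(0.+0.j)" is used.
    all_int = all(
        v.real == int(v.real) and v.imag == int(v.imag) for v in values
    )
    if all_int:
        strings = ['({:.0f}.+{:.0f}.j)'.format(v.real, v.imag)
                   for v in values]
    else:
        strings = ['({0:.{p}f}+{1:.{p}f}j)'.format(v.real, v.imag, p=precision)
                   for v in values]
    # The width is taken from the truncated strings, so no stale
    # pre-truncation width can leak into the padding.
    return strings, max(len(s) for s in strings)

strings, width = complex_widths(torch.tensor([1.2 + 1.34j, 6.5 + 7j]))
print(strings, width)  # ['(1.2000+1.3400j)', '(6.5000+7.0000j)'] 16
```
Measuring widths on the final strings is what removes the stray padding seen in the example above.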
Here are some sample outputs for float and complex tensors after the change:
```
tensor([[0., 0.],
        [0., 0.]], dtype=torch.float64)
tensor([[(0.+0.j), (0.+0.j)],
        [(0.+0.j), (0.+0.j)]], dtype=torch.complex64)
tensor([1.2000, 1.3400], dtype=torch.float64)
tensor([(1.2000+1.3400j)], dtype=torch.complex64)
tensor([[(1.0000+1.3400j), (3.0000+4.0000j)],
        [(1.2000+1.3400j), (6.5000+7.0000j)]], dtype=torch.complex64)
tensor([1.0000, 2.0000, 3.0000, 4.5000])
tensor([(1.+2.j)], dtype=torch.complex64)
```
cc ezyang anjali411 dylanbespalko
Pull Request resolved: https://github.com/pytorch/pytorch/pull/36331
Differential Revision: D20955663
Pulled By: anjali411
fbshipit-source-id: c26a651eb5c9db6fcc315ad8d5c1bd9f4b4708f7