[ONNX] add support for prim::Uninitialized in lower_tuples pass (#56912)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/56911
The code from the issue generates this TorchScript:
```
graph(%self : __torch__.MyModule,
      %t.1 : Tensor):
  %12 : None = prim::Constant()
  %7 : str = prim::Constant[value="Negative input"]() # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:11:28
  %3 : int = prim::Constant[value=0]() # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:10:15
  %9 : int = prim::Constant[value=5]() # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:13:31
  %33 : (Tensor, Tensor) = prim::Uninitialized()
  %4 : Tensor = aten::lt(%t.1, %3) # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:10:11
  %6 : bool = aten::Bool(%4) # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:10:11
  %34 : (Tensor, Tensor) = prim::If(%6) # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:10:8
    block0():
      = prim::RaiseException(%7) # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:11:12
      -> (%33)
    block1():
      %11 : int[] = prim::ListConstruct(%9)
      %16 : Tensor = aten::zeros(%11, %12, %12, %12, %12) # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:13:19
      %18 : int[] = prim::ListConstruct(%9)
      %23 : Tensor = aten::zeros(%18, %12, %12, %12, %12) # /mnt/nvdl/usr/msladek/notes/python_code/unitialized.py:13:35
      %24 : (Tensor, Tensor) = prim::TupleConstruct(%16, %23)
      -> (%24)
  return (%34)
```
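For context, the module below is a minimal sketch reconstructed from the graph dump above (the exact source from issue #56911 is not reproduced in this summary, so the class and variable names are assumptions): a scripted module that raises on negative input and otherwise returns a `(Tensor, Tensor)` tuple, which is what produces the `prim::Uninitialized` tuple in the error branch.

```python
import torch

class MyModule(torch.nn.Module):
    def forward(self, t):
        # aten::lt + aten::Bool in the graph above
        if bool(t < 0):
            # block0: prim::RaiseException; the branch still has to yield a
            # (Tensor, Tensor) value, which is the prim::Uninitialized %33
            raise Exception("Negative input")
        # block1: constructs the real (Tensor, Tensor) tuple
        return torch.zeros(5), torch.zeros(5)

scripted = torch.jit.script(MyModule())
```

Exporting such a module with `torch.onnx.export` is what hit the `lower_tuples` failure this PR fixes, because the tuple-typed output of `prim::Uninitialized` could not be forwarded.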
The problem is that the ONNX exporter's lower_tuples pass does not support forwarding tuples through prim::Uninitialized.
The solution is:
1. Add prim::Uninitialized to the supported ops in the lower_tuples pass.
2. Since a lowered prim::Uninitialized node can now have multiple outputs, call giveFreshAlias for every output.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/56912
Reviewed By: nikithamalgifb
Differential Revision: D29837200
Pulled By: SplitInfinity
fbshipit-source-id: 321fae6fe52b1523df5653dbb9ea73b998ef1cda