1de3525c - [ONNX] Handle PackedParams inputs for _propagate_and_assign_input_shapes (#56449) (#57079)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/57079

While testing the onnx 1.9 release, we see that the old bug is triggered by the caffe2 test:
`pytest test/onnx/test_pytorch_onnx_caffe2_quantized.py::TestQuantizedOps::test_small_model`

This happens because the graph inputs

```python
graph(%x.1 : Tensor,
      %conv1._packed_params : __torch__.torch.classes.quantized.Conv2dPackedParamsBase,
      %conv2._packed_params : __torch__.torch.classes.quantized.Conv2dPackedParamsBase,
      %fc.bias : Float(10, strides=[1], requires_grad=0, device=cpu),
      %fc.weight : Float(10, 72, strides=[72, 1], requires_grad=0, device=cpu)):
```

contain `Conv2dPackedParamsBase`, which is a PackedParams. When the inputs are flattened, a PackedParams flattens into several tensors, so the shape inference for the inputs becomes misaligned. This PR records how many tensors each PackedParams flattened into and skips ahead by that number rather than by 1, which makes the unit test pass. Note that the tuple case still follows the original logic.

Test Plan: Imported from OSS

Reviewed By: SplitInfinity

Differential Revision: D28393949

Pulled By: malfet

fbshipit-source-id: 98d48aad27e5ca03fb10d260f8e625478d996ee2

Co-authored-by: David <jiafa@microsoft.com>
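The misalignment described above can be illustrated with a minimal sketch. This is not the actual C++ implementation in `_propagate_and_assign_input_shapes`; the function names and data shapes below are hypothetical, and packed params are stood in for by tuples. The point is the cursor arithmetic: the buggy version always advanced the position into the flattened-tensor list by 1, while the fix advances by the number of tensors each input actually flattened into.

```python
# Hypothetical sketch of aligning graph inputs with flattened tensors.
# A "packed params" input (modeled here as a tuple) flattens into
# several tensors; a plain tensor flattens into exactly one.

def flatten_count(value):
    """How many flattened tensors this graph input produces."""
    return len(value) if isinstance(value, tuple) else 1

def align_shapes(graph_inputs, flat_shapes):
    """Assign each named graph input the shapes of the tensors it
    flattened into, skipping the cursor by that count (the fix)."""
    assigned, cursor = {}, 0
    for name, value in graph_inputs:
        n = flatten_count(value)
        assigned[name] = flat_shapes[cursor:cursor + n]
        cursor += n  # buggy version effectively advanced by 1 here
    return assigned

# A packed-params input consumes two flattened shapes, so the inputs
# after it still line up with the right shapes.
inputs = [
    ("x.1", "tensor"),
    ("conv1._packed_params", ("weight", "bias")),  # flattens to 2 tensors
    ("fc.bias", "tensor"),
]
shapes = ["shape_x", "shape_w", "shape_b", "shape_fc_bias"]
print(align_shapes(inputs, shapes))
```

Advancing by 1 instead would hand `fc.bias` the shape belonging to the packed bias tensor, which is exactly the input/shape misalignment the test exposed.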