[quant][graphmode] FoldConvBatchNorm2d support shared ClassTypes (#32379)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/32379
Folding Conv2d - BatchNorm2d modules means recalculating the weight and bias of the Conv2d module by incorporating the parameters
of BatchNorm2d, and changing the method calls so that only the forward of the Conv2d module is called. This involves changing both the module
types and the graph, because the bias of Conv2d is a parameter when it has a value and an attribute when it is
None (JIT code assumes in multiple places that a parameter is a Tensor). Therefore
we need to remove the bias attribute when it is None and add a bias attribute later. Since a ClassType might be shared, we perform the
remove and add in separate steps, and also keep track of the processed graphs to avoid modifying a graph and type multiple times.
We also record the slot index of bias so that we can replay the slot removal on other instances of the Conv2d module.
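For reference, the arithmetic of the fold itself is the standard Conv-BN fusion. Below is a minimal Python sketch (illustrative only, not the C++ pass added in this PR) of how the new Conv2d weight and bias can be computed from the BatchNorm2d parameters, assuming BatchNorm2d is in eval mode with running statistics available; all names here are hypothetical:

```python
import torch

def fold_conv_bn_weights(conv_w, conv_b, bn_rm, bn_rv, bn_eps, bn_w, bn_b):
    # Illustrative sketch of Conv2d + BatchNorm2d folding (names are not
    # the identifiers used by the actual pass).
    # If Conv2d has no bias, fold against a zero bias of matching size.
    if conv_b is None:
        conv_b = torch.zeros_like(bn_rm)
    scale = bn_w / torch.sqrt(bn_rv + bn_eps)       # per-output-channel scale
    new_w = conv_w * scale.reshape(-1, 1, 1, 1)     # rescale each output filter
    new_b = (conv_b - bn_rm) * scale + bn_b         # shift and rescale the bias
    return new_w, new_b
```

Note that every folded Conv2d ends up with a bias value, even instances that originally had bias=None, which is why the pass has to turn the None bias attribute into a real parameter as described above.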
Test Plan:
tbd
Imported from OSS
Differential Revision: D20078719
fbshipit-source-id: cee5cf3764f3e0c0a4a2a167b78dbada2e3835cc