964e61ee - [quant][pt2] Handle no conv bias in prepare QAT fusion (#100610)

Summary:
This commit adds support for conv + BN fusion for the case where conv has no bias. Since the replacement patterns with and without conv bias are substantially different, we perform the replacement for each of these two cases separately.

Test Plan:
python test/test_quantization.py TestQuantizePT2E.test_prepare_qat_conv_bn_fusion_no_conv_bias

Reviewers: jerryzh168, kimishpatel

Differential Revision: [D45743510](https://our.internmc.facebook.com/intern/diff/D45743510)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/100610
Approved by: https://github.com/jerryzh168
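
The sketch below is a minimal eager-mode illustration of why the no-bias case needs its own pattern, not the actual graph replacement used by prepare QAT (which keeps BN in the graph during training). When conv has no bias, the BN statistics still introduce an additive term, so the fused conv ends up with a bias even though the original conv had none. The helper name fuse_conv_bn_no_bias is hypothetical.

```python
import torch
import torch.nn as nn

def fuse_conv_bn_no_bias(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    # Inference-time fusion math for the case where conv.bias is None:
    #   scale = gamma / sqrt(running_var + eps)
    #   W_fused = W * scale (per output channel)
    #   b_fused = beta - running_mean * scale   (no conv bias term)
    assert conv.bias is None, "this sketch only handles the no-conv-bias case"
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    fused = nn.Conv2d(
        conv.in_channels, conv.out_channels, conv.kernel_size,
        stride=conv.stride, padding=conv.padding, dilation=conv.dilation,
        groups=conv.groups, bias=True,
    )
    with torch.no_grad():
        fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
        fused.bias.copy_(bn.bias - bn.running_mean * scale)
    return fused

# Quick numerical check against the unfused pair in eval mode.
conv = nn.Conv2d(3, 8, kernel_size=3, bias=False).eval()
bn = nn.BatchNorm2d(8).eval()
x = torch.randn(1, 3, 16, 16)
torch.testing.assert_close(
    fuse_conv_bn_no_bias(conv, bn)(x), bn(conv(x)), rtol=1e-4, atol=1e-5
)
```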