[pt][quant] Unify numerics between fakequant and quant/dequant (#37188)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/37188
Add the zero point after rounding in both fake quant and quant, so that fake quant and the quant/dequant path produce identical numerics.
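The unified convention can be sketched in plain Python. This is a simplified scalar model for illustration, not the actual ATen kernels; the parameter names (`scale`, `zero_point`, `qmin`, `qmax`) follow the usual affine-quantization convention:

```python
def quantize(x, scale, zero_point, qmin=0, qmax=255):
    # Unified convention: round first, then add the zero point.
    # (Python's round() rounds half to even, matching PyTorch's rounding.)
    q = round(x / scale) + zero_point
    # Clamp to the representable quantized range.
    return min(max(q, qmin), qmax)

def dequantize(q, scale, zero_point):
    # Inverse affine mapping back to float.
    return (q - zero_point) * scale

def fake_quantize(x, scale, zero_point, qmin=0, qmax=255):
    # Fake quant must reproduce quantize -> dequantize exactly,
    # which holds only if both add the zero point after rounding.
    return dequantize(quantize(x, scale, zero_point, qmin, qmax),
                      scale, zero_point)
```

With the zero point added before rounding instead, `round(x / scale + zero_point)` can round to a different integer than `round(x / scale) + zero_point` for borderline values, so fake quant and real quant/dequant would diverge; applying it after rounding on both paths keeps them bit-exact.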
ghstack-source-id: 103231624
Test Plan:
buck test //caffe2/test:quantization -- --print-passing-details
```
Finished test run: https://our.intern.facebook.com/intern/testinfra/testrun/4222124675094587
Summary (total time 186.50s):
PASS: 191
FAIL: 0
SKIP: 20
caffe2/test:quantization - test_numerical_consistency_per_tensor (quantization.test_fake_quant.TestFakeQuantizePerTensor)
caffe2/test:quantization - test_numerical_consistency_per_channel (quantization.test_fake_quant.TestFakeQuantizePerChannel)
caffe2/test:quantization - test_backward_per_tensor (quantization.test_fake_quant.TestFakeQuantizePerTensor)
caffe2/test:quantization - test_qadd_scalar_relu (quantization.test_quantized.TestQuantizedOps)
caffe2/test:quantization - test_mean (quantization.test_quantized.TestQNNPackOps)
caffe2/test:quantization - test_qnnpack_maxpool2d (quantization.test_quantized.TestQNNPackOps)
caffe2/test:quantization - test_qhardsigmoid (quantization.test_quantized.TestQNNPackOps)
caffe2/test:quantization - test_batch_norm3d (quantization.test_quantized.TestQuantizedOps)
caffe2/test:quantization - test_hardswish (quantization.test_quantized.TestQNNPackOps)
caffe2/test:quantization - test_qnnpack_sigmoid_sweep (quantization.test_quantized.TestQNNPackOps)
...and 10 more not shown...
FATAL: 0
TIMEOUT: 0
OMIT: 0
```
Reviewed By: jspark1105
Differential Revision: D21193552
fbshipit-source-id: f63c072d772f459ca6f0f2132aa836b2714fced1