Add fake quantize operator that works in backward pass (#40532)
Summary:
This diff adds FakeQuantizeWithBackward. It works the same way as the regular FakeQuantize module, allowing QAT to occur in the forward pass, but takes an additional quantize_backward parameter. When quantize_backward is enabled, the gradients are fake quantized as well (dynamically, using hard-coded values). This lets the user see whether quantizing the gradients in their model would cause a significant loss of accuracy.
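For illustration, here is a minimal sketch of how gradients can be fake quantized in the backward pass via a custom autograd Function. This is not the implementation from this diff: the class name, the signed 8-bit target range, and the dynamic per-tensor scale are all assumptions made for the example.

```python
import torch

class _FakeQuantizeWithBackwardSketch(torch.autograd.Function):
    # Hypothetical sketch, not the code added in this PR.
    @staticmethod
    def forward(ctx, x, scale, zero_point, quant_min, quant_max, quantize_backward):
        ctx.quantize_backward = quantize_backward
        # Usual QAT fake quantization of the activations in the forward pass.
        return torch.fake_quantize_per_tensor_affine(
            x, scale, zero_point, quant_min, quant_max)

    @staticmethod
    def backward(ctx, grad_output):
        if ctx.quantize_backward:
            # Dynamic scale from the gradient's observed range; the signed
            # 8-bit range below is a hard-coded assumption for this sketch.
            quant_min, quant_max = -128, 127
            scale = grad_output.abs().max().clamp(min=1e-12) / quant_max
            grad_output = torch.fake_quantize_per_tensor_affine(
                grad_output, scale.item(), 0, quant_min, quant_max)
        # Straight-through estimator: pass the (possibly quantized) gradient
        # back to x; the remaining inputs get no gradient.
        return grad_output, None, None, None, None, None

x = torch.randn(4, requires_grad=True)
y = _FakeQuantizeWithBackwardSketch.apply(x, 0.1, 0, -128, 127, True)
y.sum().backward()  # x.grad has been fake quantized in the backward pass
```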
Pull Request resolved: https://github.com/pytorch/pytorch/pull/40532
Test Plan: The relevant test for this can be run using `python test/test_quantization.py TestQATBackward.test_forward_and_backward`
Reviewed By: supriyar
Differential Revision: D22217029
Pulled By: durumu
fbshipit-source-id: 7055a2cdafcf022f1ea11c3442721ae146d2b3f2