[quant][graph] Add quantized batch_norm2d_relu to graph mode (#36552)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/36552
Fuses quantized batch_norm2d with relu in graph mode, covering both inplace and non-inplace relu.
The functional relu variant is tested as well (see the sketch below for the patterns covered).
Functional batch_norm is not a typical use case (since it requires the caller to pass weight, bias, running mean, and running variance explicitly), so it is not tested.
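
A minimal sketch of the module patterns this fusion targets, assuming a standard eager-mode model that is then scripted and run through graph-mode quantization; the class names here are illustrative, not part of this PR:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BNRelu(nn.Module):
    """batch_norm2d followed by module relu; fused for inplace=True and False."""
    def __init__(self, inplace):
        super().__init__()
        self.bn = nn.BatchNorm2d(3)
        self.relu = nn.ReLU(inplace=inplace)

    def forward(self, x):
        return self.relu(self.bn(x))

class BNFunctionalRelu(nn.Module):
    """batch_norm2d followed by functional relu; also matched by the fusion."""
    def __init__(self):
        super().__init__()
        self.bn = nn.BatchNorm2d(3)

    def forward(self, x):
        return F.relu(self.bn(x))
```

After quantization, the bn + relu pair in each pattern is expected to lower to the single quantized batch_norm2d_relu op rather than separate batch_norm and relu ops.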
Test Plan:
test_quantize_script.py test_batch_norm2d_relu
Imported from OSS
Differential Revision: D21075253
fbshipit-source-id: 0a07ea477cab19abf1d1b0856e623b1436240da1