[quant][graphmode] Skip quantizing input and output in matched module (#32814)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/32814
We skip quantization of the intermediate values for patterns like `Conv - ReLU`,
but we were not skipping quantization of the inputs and outputs of the graphs of matched modules.
Since we changed the way we add observers, this needs to be updated as well.
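The intent can be sketched in plain Python (a hypothetical illustration of the skipping logic, not the actual JIT observer-insertion pass; `insert_observers` and its arguments are invented for this sketch):

```python
# Hypothetical sketch: when a pattern like Conv -> ReLU is matched,
# observers go only on the pattern's external input and final output.
# Intermediate values inside the matched pattern are skipped, and the
# matched module's internal graph is not observed again.

def insert_observers(graph_values, matched_patterns):
    """Return the value names that receive an observer.

    graph_values: ordered list of value names in the graph.
    matched_patterns: list of matched fusion patterns; each pattern is
    an ordered list of the values it produces, last one being its output.
    """
    skip = set()
    for pattern in matched_patterns:
        # Skip every value produced inside the pattern except the last
        # one (the pattern's output): the fused op is observed as a unit.
        skip.update(pattern[:-1])
    return [v for v in graph_values if v not in skip]

values = ["x", "conv_out", "relu_out"]
observed = insert_observers(values, [["conv_out", "relu_out"]])
# "conv_out", the intermediate between Conv and ReLU, is skipped
```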
Test Plan:
python test/test_jit.py -- 'TestJit.test_insert_observers_skip_values'
Imported from OSS
Differential Revision: D20208785
fbshipit-source-id: ce30f2c4c8ce737500d0b41357c80ec8b33aecf9