[quant][graphmode] Make `aten::relu` a general op (#35420)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/35420
This PR makes `aten::relu` a general op that doesn't require observation.
This means we also need to change the logic for skipping intermediate values: the old
approach breaks the `conv - relu` pattern when it is not followed by something quantizable,
since `conv` is quantizable but we decide to skip observing between `conv` and `relu`.
We replaced the old `skip_values` with a new `delay_observation_map_`, which records the
information needed to delay the observation of certain values until a later point. In the case of
the `conv - relu` pattern, we delay the observation of the output of `conv` and observe the output
of `relu` instead.
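To illustrate the idea (this is a minimal standalone sketch, not the actual pass code; the map
and value names below are simplified stand-ins for the JIT IR types and internals):

```cpp
// Sketch of the delay-observation idea behind `delay_observation_map_`:
// instead of simply skipping a value, record which later value should be
// observed in its place. `%conv_out` / `%relu_out` are hypothetical IR values.
#include <iostream>
#include <map>
#include <string>

int main() {
  // Pretend graph: %conv_out = conv(%x); %relu_out = relu(%conv_out)
  // `aten::relu` is a general op, so %conv_out itself is not observed;
  // its observation is delayed to %relu_out.
  std::map<std::string, std::string> delay_observation_map = {
      {"%conv_out", "%relu_out"},
  };

  auto observe = [&](const std::string& value) {
    auto it = delay_observation_map.find(value);
    if (it != delay_observation_map.end()) {
      std::cout << "delay observing " << value << ", observe " << it->second
                << " instead\n";
    } else {
      std::cout << "observe " << value << "\n";
    }
  };

  observe("%conv_out");  // delayed -> %relu_out
  observe("%relu_out");  // observed normally
  return 0;
}
```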
Test Plan:
python test/test_jit.py
Imported from OSS
Differential Revision: D20655309
fbshipit-source-id: 37dbe8a5e2f4cd7582ed67c405f9cf437dd00dbe