ns for fx: remove quantized ReLU6 from mapping
Summary:
The quantized ReLU6 module is no longer swapped in by FX graph mode quantization,
because nn.ReLU6 can take quantized inputs directly. Remove it from the
Numeric Suite (NS) for FX mappings accordingly.
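The rationale above (ReLU6 accepting quantized inputs without a module swap) can be sketched as below; the scale and zero-point values are illustrative, not from the PR:

```python
import torch
import torch.nn as nn

# Quantize a float tensor to quint8 (illustrative quantization parameters).
x = torch.randn(4)
qx = torch.quantize_per_tensor(x, scale=0.1, zero_point=0, dtype=torch.quint8)

# nn.ReLU6 runs directly on the quantized tensor, so FX graph mode
# quantization does not need to swap it for a quantized module.
m = nn.ReLU6()
out = m(qx)
print(out.is_quantized)
```

Since the same module type appears in both float and quantized graphs, a separate NS mapping entry for a quantized ReLU6 is no longer needed.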
Test plan:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/76992
Approved by: https://github.com/jerryzh168