Refactor unittests for activation functions relu, elu, and sigmoid (#39190)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/39190
The tests previously covered by test_qrelu, test_qrelu6, test_qsigmoid, and test_qhardsigmoid have been merged into a single test for conciseness and to reduce redundancy.
The refactoring aims to provide the basis for a more generalizable framework to test quantized activation functions and more in the future.
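The merged-test pattern can be illustrated with a minimal sketch: a single driver iterates over a table of activation functions and their float reference implementations, replacing four near-identical test bodies. All names below (ACTIVATIONS, check_activation, run_all) are hypothetical and are not the identifiers used in the actual PyTorch test suite; the real test additionally compares quantized kernel outputs against these float references.

```python
import math

# Hypothetical table mapping each activation name to a float reference
# implementation. The real test suite pairs these with quantized kernels.
ACTIVATIONS = {
    "relu": lambda x: max(x, 0.0),
    "relu6": lambda x: min(max(x, 0.0), 6.0),
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "hardsigmoid": lambda x: min(max(x / 6.0 + 0.5, 0.0), 1.0),
}

def check_activation(fn, inputs):
    """Evaluate one activation over sample inputs; a real quantized test
    would compare a quantized op's dequantized output against this."""
    return [fn(x) for x in inputs]

def run_all(inputs):
    # One parameterized loop replaces four separate test functions.
    return {name: check_activation(fn, inputs)
            for name, fn in ACTIVATIONS.items()}
```

Adding coverage for a new activation then only requires a new table entry rather than a new test function, which is the generalizability the refactor is aiming for.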
Test Plan:
1. On a devserver, build PyTorch from source by running the command "buck build mode/dev //caffe2:torch"
2. Run the merged unit test through the commands
"buck test mode/dev //caffe2/test:quantization -- test_qrelu"
"buck test mode/dev //caffe2/test:quantization -- test_qrelu6"
"buck test mode/dev //caffe2/test:quantization -- test_qsigmoid"
"buck test mode/dev //caffe2/test:quantization -- test_qhardsigmoid"
Reviewed By: z-a-f
Differential Revision: D21755690
fbshipit-source-id: ef62b2a50ee1c3b8607746f47fb587561e75ff25