c7027f19 - [quant][fx] Add support for dynamic linear + relu fusion (INT8) (#63799)

[quant][fx] Add support for dynamic linear + relu fusion (INT8) (#63799)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/63799

Adds a new module that can be used for module swap with the nni.LinearReLU module in the convert function. Currently supports INT8 only, since the FP16 op does not yet have a ReLU fusion.

Fixes #55393

Test Plan:
python test/test_quantization.py test_dynamic_fusion

Imported from OSS

Reviewed By: heitorschueroff

Differential Revision: D30502812

fbshipit-source-id: 3668e4f001a0626d469e17ac323acf582ee28a51
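A minimal sketch of the pattern this commit targets: a model with a Linear followed by a ReLU, dynamically quantized to INT8. The example below uses eager-mode `torch.ao.quantization.quantize_dynamic` for illustration only; the FX graph-mode path changed in this commit additionally fuses the Linear + ReLU pair into a single `LinearReLU` module during convert. The toy module `M` and its sizes are hypothetical.

```python
import torch
import torch.nn as nn

# Hypothetical toy model exhibiting the Linear -> ReLU pattern that the
# FX convert function can now swap for a fused dynamic LinearReLU module.
class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.linear(x))

model = M().eval()

# Eager-mode dynamic INT8 quantization: weights are quantized to qint8,
# activations are quantized dynamically at runtime. (The FX path in this
# commit goes further and fuses the ReLU into the quantized linear op.)
qmodel = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

out = qmodel(torch.randn(2, 4))
```

Because the ReLU still runs (fused or not), the quantized model's output is non-negative, which is a quick sanity check that the swapped module preserves the original semantics.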