pytorch
bdcaf633 - Support for add relu functional module (#26612)

Support for add relu functional module (#26612)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/26612

Add support for the add_relu functional module; this allows fusion of the add and relu quantized operations.

ghstack-source-id: 91055976

Test Plan: buck test caffe2/test:quantization -- 'test_functional_module \(test_quantization\.FunctionalModuleTest\)' --print-passing-details

Differential Revision: D17518268

fbshipit-source-id: e1e8b4655d6b32405863ab9d1c7da111fb4343cc
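As a rough illustration (not taken from this commit), the sketch below shows how a model might route an add-plus-relu pair through the quantization functional wrapper so the pair can be replaced by a single fused quantized kernel after conversion. The class path torch.nn.quantized.FloatFunctional and the AddReluBlock module name are assumptions based on PyTorch's eager-mode quantization API, not code from this PR.

import torch
import torch.nn as nn

class AddReluBlock(nn.Module):
    # Hypothetical example module: expresses relu(x + y) through the
    # functional wrapper so quantization conversion can swap in the
    # fused add_relu quantized op.
    def __init__(self):
        super().__init__()
        # FloatFunctional records the op; conversion replaces it with its
        # quantized counterpart, which dispatches to the fused kernel.
        self.skip_add = nn.quantized.FloatFunctional()

    def forward(self, x, y):
        # Equivalent to torch.relu(x + y) in float mode.
        return self.skip_add.add_relu(x, y)

# Float (eager-mode) usage; the fused quantized path applies only after
# the model has been prepared and converted for quantization.
block = AddReluBlock()
out = block(torch.randn(2, 3), torch.randn(2, 3))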