ab1e88e3 - [Quant][Eager][improvement] Added 4 bit support for eager mode quantization flow (reland PR 69806) (#72277)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/72277

Minor modifications were made to support the 4-bit quantized embedding module in the eager mode quantization flow and to allow the changes to be tested.

Test Plan:
In the pytorch main dir, execute
```
python test_quantization.py TestPostTrainingStatic.test_quantized_embedding
```

Reviewed By: jerryzh168

Differential Revision: D33994545

Pulled By: dzdang

fbshipit-source-id: faafad54b7b07fc393904ba55c2b2ac934c276f7

(cherry picked from commit 042ffb2091dd85a6273c97438dc7e913fcd03224)
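As a rough illustration of the flow this change extends, the sketch below runs eager mode post-training quantization on an `nn.Embedding` with a 4-bit weight-only qconfig. The qconfig name `float_qparams_weight_only_qconfig_4bit` and its location in `torch.ao.quantization.qconfig` reflect PyTorch releases around this commit and are assumptions for illustration, not the exact code added by this PR.

```python
# Minimal sketch: eager mode post-training quantization of an embedding to 4 bits.
# Assumes the 4-bit weight-only qconfig is exposed as
# torch.ao.quantization.qconfig.float_qparams_weight_only_qconfig_4bit (PyTorch ~1.11+).
import torch
import torch.nn as nn
from torch.ao.quantization import prepare, convert
from torch.ao.quantization.qconfig import float_qparams_weight_only_qconfig_4bit


class EmbeddingModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(num_embeddings=10, embedding_dim=16)

    def forward(self, indices):
        return self.emb(indices)


model = EmbeddingModel().eval()
# Weight-only qconfig: the activation observer is a placeholder; the weight
# observer records per-row float qparams with a 4-bit (quint4x2) dtype.
model.emb.qconfig = float_qparams_weight_only_qconfig_4bit
prepare(model, inplace=True)   # attaches the observers
convert(model, inplace=True)   # swaps nn.Embedding for its quantized counterpart
print(model.emb)               # quantized Embedding holding 4-bit packed weights
out = model(torch.randint(0, 10, (4,)))
```

Under this qconfig the weight is observed with the `torch.quint4x2` dtype, which packs two 4-bit values per byte, so the converted module stores the embedding table at roughly half the size of the 8-bit variant produced by `float_qparams_weight_only_qconfig`.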