pytorch
[Pytorch, sparsity] Bug fix to update requantization and zp parameters of input (#52797)

Committed 3 years ago
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/52797

Also sneaking in a change to check for realloc failure for the packed activation buffer.

FB: In dynamic quantization, the input's quantization scale and zero point can be different on every iteration, so the requantization scale needs to be recomputed each run. An earlier bug computed those parameters only at op-creation time, which produced wrong results on subsequent runs. This diff fixes that.

Test Plan:
FB: buck test caffe2/torch/fb/model_optimization:sparsity_test

Reviewed By: z-a-f, jiatongzhou

Differential Revision: D26651968

fbshipit-source-id: e5b9acef03fc45f31c43d88a175f3a64f7dbf4bd
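The bug pattern the commit describes can be sketched as follows. This is a minimal illustrative model, not the actual PyTorch/QNNPACK code: the class and method names (`QuantizedLinearOp`, `run`) are hypothetical, and the requantization scale is assumed to follow the common form `input_scale * weight_scale / output_scale`. The point is that in dynamic quantization the input's scale and zero point are picked fresh on every call, so any parameter derived from them must be recomputed per run rather than cached at op creation.

```python
class QuantizedLinearOp:
    """Hypothetical sketch of a dynamically quantized op (not PyTorch API)."""

    def __init__(self, weight_scale, output_scale):
        # Weight and output quantization parameters are fixed at creation.
        self.weight_scale = weight_scale
        self.output_scale = output_scale
        # Pre-fix bug: computing the requantization scale here, once, would
        # go stale, because dynamic quantization chooses a new input scale
        # and zero point on every iteration.

    def run(self, input_scale, input_zero_point):
        # Fix: derive the requantization scale from the *current* input
        # quantization parameters on every call.
        requant_scale = input_scale * self.weight_scale / self.output_scale
        return requant_scale, input_zero_point


op = QuantizedLinearOp(weight_scale=0.5, output_scale=2.0)
s1, _ = op.run(input_scale=0.1, input_zero_point=0)  # first iteration
s2, _ = op.run(input_scale=0.2, input_zero_point=3)  # new input qparams
```

Here `s1` and `s2` differ (0.025 vs 0.05), which is exactly the behavior a creation-time cache would miss: it would keep returning the first value on every subsequent run.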