OneDNN MaxPooling: reduce memory use for inference path (#52728)
Summary:
For oneDNN MaxPooling training, indices are saved as a workspace for the backward pass. For inference, however, indices are not needed, so this PR adds a check that skips saving them, reducing memory use on the inference path.
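A minimal sketch of the idea, in plain Python: oneDNN distinguishes the `forward_training` and `forward_inference` propagation kinds, and only the training kind makes max pooling emit an indices workspace for backward. The helper name below is hypothetical and only illustrates the check this PR introduces; it is not the actual PyTorch implementation.

```python
def select_pooling_prop_kind(requires_backward: bool) -> str:
    """Pick the oneDNN propagation kind for max pooling.

    Hypothetical helper: "forward_training" asks oneDNN to keep an
    indices workspace for the backward pass; "forward_inference"
    skips that allocation, saving memory on the inference path.
    """
    return "forward_training" if requires_backward else "forward_inference"


# Training path keeps the workspace; inference path avoids it.
print(select_pooling_prop_kind(True))   # forward_training
print(select_pooling_prop_kind(False))  # forward_inference
```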
Pull Request resolved: https://github.com/pytorch/pytorch/pull/52728
Reviewed By: jbschlosser
Differential Revision: D27062435
Pulled By: VitalyFedyunin
fbshipit-source-id: 9e70268a8ba491a7914b980079c0945d753cd4f3