pytorch
fd82f118 - [lite interpreter][hack] Add batch_norm_update_stats if batchnorm and training are present (#100134)

Summary: It is unclear how the `train` bool passed to batch_norm gets set, but it is not the `is_training` module-level flag. Because of this, teams trying to do on-device training see weird behavior (running statistics are not updated as expected).

Test Plan: ci

Differential Revision: D45335791

Pull Request resolved: https://github.com/pytorch/pytorch/pull/100134
Approved by: https://github.com/larryliu0820
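The distinction the commit message describes can be illustrated directly: the `training` argument to `torch.nn.functional.batch_norm` controls whether the running statistics are updated in place, independently of any module-level training flag. A minimal sketch (assuming a standard PyTorch install; the tensor shapes are illustrative):

```python
import torch
import torch.nn.functional as F

# Running statistics that batch_norm may update in place.
running_mean = torch.zeros(3)
running_var = torch.ones(3)
x = torch.randn(8, 3)

# training=False: the batch is normalized with the stored statistics,
# and running_mean / running_var are left untouched.
F.batch_norm(x, running_mean, running_var, training=False)
assert torch.equal(running_mean, torch.zeros(3))

# training=True: batch statistics are computed and the running
# statistics are updated in place, regardless of any module flag.
F.batch_norm(x, running_mean, running_var, training=True)
assert not torch.equal(running_mean, torch.zeros(3))
```

This is why a mismatch between the serialized `train` bool and the module-level flag matters for on-device training: only the functional argument decides whether `batch_norm_update_stats` behavior kicks in.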