pytorch
3957ed41 - [DDP] Disable reducer hooks from running outside of DDP backwards. (#60921)

Committed 4 years ago
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/60921

Sometimes local modules can fire reducer hooks, such as when the user calls backward after invoking `ddp_module.module` directly. This is unsupported behavior and can corrupt the bookkeeping state and gradient reduction that DDP runs during its backward pass, so these hooks are now disabled entirely outside of DDP backwards.

ghstack-source-id: 132739311

Test Plan: CI

Reviewed By: SciPioneer

Differential Revision: D29435737

fbshipit-source-id: fef76a0dd2955c432131632fb81dde4a4982ad91
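The scenario the commit guards against can be sketched as follows. This is a minimal, hypothetical reproduction (not the PR's actual test): it runs a single-process `gloo` group, performs a supported backward through the DDP wrapper, then a backward through the bare `ddp_model.module`, which previously could trigger the reducer's autograd hooks outside of DDP's own backward pass.

```python
# Hypothetical sketch of the unsupported usage this commit disables.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process process group for illustration (assumed free port).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 2)
ddp_model = DDP(model)

# Supported: forward through the DDP wrapper, so the reducer's hooks
# run as part of DDP's backward and gradients are synchronized.
loss = ddp_model(torch.randn(8, 4)).sum()
loss.backward()

# Unsupported: forward/backward through the bare local module.  With
# this commit, the reducer hooks are prevented from running here rather
# than firing outside DDP backwards and corrupting reduction state.
local_loss = ddp_model.module(torch.randn(8, 4)).sum()
local_loss.backward()

dist.destroy_process_group()
```

The design choice is to make the hooks no-ops outside DDP's backward rather than raise on every such call, since gradient reduction state is only valid within the pass DDP itself orchestrates.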