pytorch
4e53e87e - [DDP] Disable reducer hooks from running outside of DDP backwards.

[DDP] Disable reducer hooks from running outside of DDP backwards.

Local modules can sometimes fire the reducer's autograd hooks outside of a DDP backward pass, for example when a user calls backward after invoking the wrapped module directly via `ddp_module.module`. This is not supported behavior and can corrupt the internal state and gradient reduction that DDP runs during backward, so these hooks are now disabled entirely when they fire outside of a DDP backward.

Differential Revision: [D29435737](https://our.internmc.facebook.com/intern/diff/D29435737/)

**NOTE FOR REVIEWERS**: This PR has internal Facebook specific changes or comments, please review them on [Phabricator](https://our.internmc.facebook.com/intern/diff/D29435737/)!

[ghstack-poisoned]
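Below is a minimal Python sketch of the usage pattern this change guards against, assuming a single-process gloo process group purely for illustration; the init_method address, model, and tensor shapes are hypothetical and not taken from the commit.

```python
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

# Hypothetical single-process setup for illustration; a real job would
# launch one process per rank (e.g. via torchrun).
dist.init_process_group(
    "gloo", init_method="tcp://127.0.0.1:29500", rank=0, world_size=1
)

model = nn.Linear(10, 10)
ddp_model = DDP(model)

# Supported path: forward through the DDP wrapper, so the reducer is
# prepared for its autograd hooks to fire during the subsequent backward.
ddp_model(torch.randn(4, 10)).sum().backward()

# Unsupported path: bypassing the wrapper via `ddp_model.module`.
# Backward still reaches the reducer's per-parameter autograd hooks, but
# DDP never prepared for this iteration, which can corrupt reducer state
# and gradient reduction. Per the commit message, the reducer hooks are
# disabled in this case rather than being allowed to run silently.
ddp_model.module(torch.randn(4, 10)).sum().backward()
```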
Files changed:
  • torch/csrc/distributed/c10d/reducer.cpp
  • torch/csrc/distributed/c10d/reducer.hpp