pytorch
b26df43f - Fix bug where __getstate__ of DDP looks for self._replicated_tensor_module

Committed 3 years ago
Fix bug where `__getstate__` of DDP looks for `self._replicated_tensor_module`

Pull Request resolved: https://github.com/pytorch/pytorch/pull/76349

When ReplicatedTensor is not in use in DDP and we try to save a DDP module, saving errors out because `__getstate__` unconditionally tries to delete the `_replicated_tensor_module` attribute, which does not exist in that configuration. Fix this by checking whether ReplicatedTensor mode is enabled before triggering the delete.

Differential Revision: [D35875167](https://our.internmc.facebook.com/intern/diff/D35875167/)

Approved by: https://github.com/mrshenli, https://github.com/zhaojuanmao
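The pattern the commit describes can be sketched as follows. This is a minimal, hypothetical illustration of guarding the delete in `__getstate__`, not the actual DDP implementation; the class and attribute names (`DDPStateSketch`, `_use_replicated_tensor`) are assumptions for the example.

```python
import pickle


class DDPStateSketch:
    """Illustrative stand-in for DDP's pickling behavior (not the real class)."""

    def __init__(self, use_replicated_tensor: bool):
        self._use_replicated_tensor = use_replicated_tensor
        if use_replicated_tensor:
            # Only created when ReplicatedTensor mode is enabled.
            self._replicated_tensor_module = {}

    def __getstate__(self):
        state = self.__dict__.copy()
        # Buggy version deleted unconditionally:
        #     del state["_replicated_tensor_module"]
        # which raises KeyError when ReplicatedTensor mode is off.
        # Fixed version guards the delete on the mode flag:
        if self._use_replicated_tensor:
            del state["_replicated_tensor_module"]
        return state


# Saving now works whether or not the mode is enabled.
restored = pickle.loads(pickle.dumps(DDPStateSketch(use_replicated_tensor=False)))
```

The same guard also keeps the attribute out of the pickled state when the mode is enabled, so the non-picklable module reference is never serialized.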
Files changed:
  • test/distributed/_shard/test_replicated_tensor.py
  • torch/nn/parallel/distributed.py