[DDP] Fix when buffers are reassigned in module (#64472)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/64472
Sometimes, a user module can reassign a tensor buffer, as in:
```
self.buffer = torch.randn(1, 2)  # in __init__
self.buffer = self.buffer + 1    # in forward; rebinds the attribute to a new tensor
```
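
To see why this breaks buffer syncing, note that the reassignment rebinds the buffer name to a brand-new tensor object, while any list of buffers built earlier still references the old one. A standalone illustration of the staleness (not DDP code; the module here is a made-up example):
```
import torch

class M(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer("buffer", torch.randn(1, 2))

    def forward(self):
        # Rebinds self.buffer (i.e. _buffers["buffer"]) to a new tensor.
        self.buffer = self.buffer + 1

m = M()
cached = list(m.buffers())    # analogous to what DDP caches at construction
m()
print(cached[0] is m.buffer)  # False: the cached reference is now stale
```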
In this case, `self.modules_buffers` becomes outdated, so we repopulate
`self.modules_buffers` whenever module buffers need to be synced.
See https://github.com/pytorch/pytorch/issues/63916 for a full description of
the issue.
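
Conceptually, the fix refreshes the cached buffer list from the wrapped module right before each buffer sync, rather than trusting the list built at construction time. A minimal sketch of the idea, with illustrative helper names rather than the exact code in this diff:
```
import torch.distributed as dist

def _assign_modules_buffers(ddp):
    # Illustrative helper: re-read the live buffers from the wrapped module
    # so tensors reassigned in forward() replace the stale cached ones.
    ddp.modules_buffers = [buf for _, buf in ddp.module.named_buffers()]

def _sync_module_buffers(ddp, authoritative_rank=0):
    # Repopulate before every sync; the cached list may reference old tensors.
    _assign_modules_buffers(ddp)
    for buf in ddp.modules_buffers:
        dist.broadcast(buf, src=authoritative_rank)
```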
ghstack-source-id: 137526309
Test Plan: CI
Reviewed By: zhaojuanmao
Differential Revision: D30745921
fbshipit-source-id: 25eb1edbf445703a481802e07f3058d38ea6fc64