f3af5ba4 - [WIP] Composable API: `replicate` and `DistributedState` (#87649)

This PR adds the first version of the `replicate()` composable API. This prototype version reuses as much code as possible from the existing `DistributedDataParallel` implementation and will be iterated on in later changes. The basic idea of the prototype is:

- Create a `ReplicateState` object. It internally uses a `ParameterList` module to hold all parameters of the modules marked by the `replicate()` API.
- Create an internal `_ddp` object, which reuses the existing `DistributedDataParallel` implementation and wraps the `ParameterList` object.
- Install pre-forward and after-forward hooks on the root module, which call methods of `_ddp` to run initialization and forward.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/87649
Approved by: https://github.com/zhaojuanmao
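For reference, a minimal sketch of the design described above, not the actual implementation in this PR: the `mark_module` helper, the lazy-init logic, and the hook bodies are illustrative assumptions, and a process group is assumed to be initialized already (e.g. via `torch.distributed.init_process_group`).

```python
# Sketch only: illustrates the ReplicateState / ParameterList / internal DDP idea
# from the commit message. Names other than ReplicateState, replicate(), and _ddp
# are assumptions, not the real PyTorch API.
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


class ReplicateState:
    """Holds parameters of modules marked by `replicate()` and an internal DDP."""

    def __init__(self) -> None:
        self._param_list = nn.ParameterList()  # collects params of replicated modules
        self._ddp = None                       # lazily built DDP over the ParameterList
        self._modules = []

    def mark_module(self, module: nn.Module) -> None:
        # Record the module; its parameters are gathered at lazy-init time.
        self._modules.append(module)

    def _lazy_init(self) -> None:
        if self._ddp is not None:
            return
        for m in self._modules:
            for p in m.parameters():
                self._param_list.append(p)
        # Reuse the existing DistributedDataParallel machinery by wrapping the
        # ParameterList module instead of the user's root module.
        self._ddp = DDP(self._param_list)

    def root_pre_forward(self, module: nn.Module, args):
        # Pre-forward hook on the root module: run initialization before the
        # first forward pass (the real code would also call into _ddp here).
        self._lazy_init()
        return None

    def root_post_forward(self, module: nn.Module, args, output):
        # After-forward hook on the root module: in the real implementation this
        # is where _ddp's per-iteration bookkeeping would be triggered.
        return None


def replicate(module: nn.Module) -> nn.Module:
    """Composable-style API sketch: mark `module` for replication in place."""
    state = ReplicateState()
    state.mark_module(module)
    module.register_forward_pre_hook(state.root_pre_forward)
    module.register_forward_hook(state.root_post_forward)
    return module
```

Usage would look roughly like `model = replicate(MyModel())`; the first forward pass then triggers the lazy DDP initialization over the collected `ParameterList`.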