1dae59ba - [Checkpoint][2D][1/N] Add dedup_tensors for distributed checkpoint to core distributed (#89399)

This PR moves dedup_tensors and its test to torch.distributed.checkpoint. This is a prerequisite for enabling 2D checkpoint.

dedup_tensors removes duplicated shards from a list of SavePlan objects. It is used when saving a DTensor (DT) with replicated placement. Docstrings and comments will be added in follow-up PRs.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89399
Approved by: https://github.com/wanchaol
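As a rough illustration of the idea only (not the actual torch.distributed.checkpoint implementation), deduplication can be sketched as keeping each shard's write item in exactly one rank's plan, so a shard replicated across ranks is written once. The `SimpleWriteItem` and `SimplePlan` classes below are hypothetical stand-ins for the real planner types:

```python
# Hypothetical sketch of shard deduplication across per-rank save plans.
# SimpleWriteItem / SimplePlan stand in for the real SavePlan / WriteItem
# types in torch.distributed.checkpoint; the actual dedup logic may differ.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass(frozen=True)
class SimpleWriteItem:
    # A fully qualified name plus a shard index identifies a unique shard.
    fqn: str
    shard_index: int


@dataclass
class SimplePlan:
    items: List[SimpleWriteItem] = field(default_factory=list)


def dedup_plans(all_plans: List[SimplePlan]) -> List[SimplePlan]:
    """Keep each shard in exactly one plan (the lowest rank that has it)."""
    owner: Dict[SimpleWriteItem, int] = {}
    for rank, plan in enumerate(all_plans):
        for item in plan.items:
            owner.setdefault(item, rank)  # first rank to claim the shard wins
    return [
        SimplePlan([item for item in plan.items if owner[item] == rank])
        for rank, plan in enumerate(all_plans)
    ]


if __name__ == "__main__":
    # Two ranks hold the same replicated shard; after dedup only rank 0 writes it.
    replicated = SimpleWriteItem("model.weight", shard_index=0)
    plans = [SimplePlan([replicated]), SimplePlan([replicated])]
    deduped = dedup_plans(plans)
    assert len(deduped[0].items) == 1 and len(deduped[1].items) == 0
```

Without this step, every rank holding a replicated placement would write the same bytes to the checkpoint, wasting I/O and storage; deduplicating at the plan level keeps the global save coordinated while each rank still only writes the items left in its own plan.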