pytorch
6b085d5c - [Checkpoint][2D][2/N] Add traverse for distributed checkpoint to core distributed (#89398)

[Checkpoint][2D][2/N] Add traverse for distributed checkpoint to core distributed (#89398)

This PR moves `traverse` and its test to `torch.distributed.checkpoint`. This is a prerequisite for enabling 2D checkpoint; traversal is used when flattening nested dicts and flattening sharded tensors. Docstrings and comments will be added in follow-up PRs.

Test:
```
python3 test/distributed/_tensor/parallel/test_2d_parallel.py
```
and CI

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89398
Approved by: https://github.com/wanchaol
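To illustrate the idea, here is a minimal sketch of what traversing and flattening a nested state dict can look like. This is an assumption-laden illustration, not the actual `torch.distributed.checkpoint` implementation; the function names `traverse_state_dict` and `flatten_state_dict` are hypothetical here, and real sharded tensors would need type-specific handling at the leaves.

```python
# Illustrative sketch only -- NOT the torch.distributed.checkpoint API.
# Recursively walks nested dicts/lists and invokes a visitor callback
# with the path to each leaf value, then uses that to flatten the dict.
from typing import Any, Callable, Tuple

Path = Tuple[Any, ...]  # e.g. ("model", "layer1", "weight")


def traverse_state_dict(
    state_dict: dict, visitor: Callable[[Path, Any], None]
) -> None:
    """Call visitor(path, value) for every leaf in a nested state dict."""

    def _traverse(path: Path, value: Any) -> None:
        if isinstance(value, dict):
            for key, child in value.items():
                _traverse(path + (key,), child)
        elif isinstance(value, (list, tuple)):
            for index, child in enumerate(value):
                _traverse(path + (index,), child)
        else:
            # Leaf value (e.g. a tensor in a real state dict).
            visitor(path, value)

    _traverse((), state_dict)


def flatten_state_dict(state_dict: dict) -> dict:
    """Flatten a nested state dict into {path_tuple: leaf_value}."""
    flat: dict = {}
    traverse_state_dict(state_dict, lambda path, value: flat.__setitem__(path, value))
    return flat
```

For example, `flatten_state_dict({"model": {"w": 1, "layers": [2, 3]}})` yields a mapping keyed by path tuples such as `("model", "w")` and `("model", "layers", 0)`, which is the shape of transformation a 2D checkpoint needs before mapping entries to storage.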