22e7514a - [Checkpoint][2D][3/N] Add nested_tensors for distributed checkpoint to core distributed (#89501)

This PR moves nested_tensors to torch.distributed.checkpoint. It is a prerequisite for enabling 2D checkpoint: it flattens the sharded tensors in a state_dict, and is used when saving and loading an FSDP SHARDED_STATE_DICT. Docstrings, individual tests, and integration tests will be added in the following PRs.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89501
Approved by: https://github.com/wanchaol
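To illustrate the kind of transformation the commit describes, here is a minimal, self-contained sketch of flattening a nested state_dict into dotted keys plus a mapping that allows the original nesting to be rebuilt. The function names `flatten_state_dict` and `unflatten_state_dict`, and the exact mapping format, are assumptions for illustration, not the actual torch.distributed.checkpoint API added by this PR.

```python
from typing import Any, Dict, Tuple

# Hypothetical sketch: flatten a nested state_dict into dotted keys,
# recording the original key path for each leaf so the nesting can be
# reconstructed when loading.

def flatten_state_dict(
    state_dict: Dict[str, Any],
) -> Tuple[Dict[str, Any], Dict[str, Tuple[str, ...]]]:
    flattened: Dict[str, Any] = {}
    mapping: Dict[str, Tuple[str, ...]] = {}

    def _flatten(value: Any, path: Tuple[str, ...]) -> None:
        if isinstance(value, dict):
            # Recurse into nested dicts, extending the key path.
            for key, child in value.items():
                _flatten(child, path + (key,))
        else:
            # Leaf (e.g. a tensor or shard): store under a dotted key.
            flat_key = ".".join(path)
            flattened[flat_key] = value
            mapping[flat_key] = path

    _flatten(state_dict, ())
    return flattened, mapping


def unflatten_state_dict(
    flattened: Dict[str, Any],
    mapping: Dict[str, Tuple[str, ...]],
) -> Dict[str, Any]:
    nested: Dict[str, Any] = {}
    for flat_key, value in flattened.items():
        path = mapping[flat_key]
        current = nested
        # Walk (and create) intermediate dicts along the recorded path.
        for part in path[:-1]:
            current = current.setdefault(part, {})
        current[path[-1]] = value
    return nested
```

With this sketch, a nested `{"model": {"layer1": {"weight": w}}}` flattens to `{"model.layer1.weight": w}`, which is the flat-key shape a checkpoint planner can shard and save, and `unflatten_state_dict` restores the original structure on load.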