pytorch @ b8b74800 - [Checkpoint][2D][6/N] Add optimizer and update default_planner to core distributed (#90212)

This is the last PR for integrating 2D into core distributed. This PR does the following:

1. Add optimizer.py: this adds the ability to load a state_dict in conjunction with FSDP sharded optimizer state (a minimal usage sketch is included below).
2. Update default_planner.py to support 2D checkpoints.
3. Add test_fsdp_optim_state.py as a unit test for No. 1.
4. Fix a bug in torch/testing/_internal/distributed/checkpoint_utils.py.
5. Rename the files containing APIs that should be private. Further organization and cleanup will follow in subsequent PRs: #90328.

Docstrings and an integration test will be added in follow-up PRs.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90212
Approved by: https://github.com/wanchaol
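For context, here is a minimal sketch of how the loading path described in item 1 is typically driven. It assumes the helper added in optimizer.py is exposed as `torch.distributed.checkpoint.optimizer.load_sharded_optimizer_state_dict`, that `model` is already FSDP-wrapped and configured for sharded state dicts, and that `CHECKPOINT_DIR` is a hypothetical path; exact signatures (notably `FSDP.optim_state_dict_to_load`) have varied across PyTorch releases, so treat this as a sketch rather than a pinned API.

```python
# Sketch: restoring a model plus FSDP-sharded optimizer state with
# torch.distributed.checkpoint. Assumes a distributed process group is
# already initialized and the checkpoint was saved in the same layout.
import torch.distributed.checkpoint as dist_cp
from torch.distributed.checkpoint.optimizer import load_sharded_optimizer_state_dict
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

CHECKPOINT_DIR = "/tmp/2d_checkpoint"  # hypothetical path, not from the PR


def load_model_and_optim(model, optim):
    reader = dist_cp.FileSystemReader(CHECKPOINT_DIR)

    # Load the model state first; the (default) planner resolves each
    # rank's shard of the saved tensors.
    state_dict = {"model": model.state_dict()}
    dist_cp.load_state_dict(state_dict=state_dict, storage_reader=reader)
    model.load_state_dict(state_dict["model"])

    # Load the optimizer state, resolved against the sharded model
    # state_dict so each rank only reads the entries it owns.
    optim_state = load_sharded_optimizer_state_dict(
        model_state_dict=state_dict["model"],
        optimizer_key="optim",
        storage_reader=reader,
    )

    # Re-flatten the sharded optimizer state to match FSDP's internal
    # parameter layout, then load it. Argument order here follows newer
    # releases; older versions took (optim_state_dict, model, optim).
    flattened = FSDP.optim_state_dict_to_load(model, optim, optim_state["optim"])
    optim.load_state_dict(flattened)
```

The notable design point, going by the PR description, is that optimizer entries are keyed against the model's sharded state_dict, which appears to be what lets default_planner map them onto the 2D (sharded) layout.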