pytorch
32fde537 - [FSDP][5/N] Add manual "wrapping" support for `fully_shard` (#90874)

This PR adds manual "wrapping" support for `fully_shard`. For example, for

```
fully_shard(mod.sub)
fully_shard(mod)
```

`mod.sub` and `mod` will share the same FSDP data structures.

To have parity with wrapper FSDP, this PR only checks support for the case where each manual application of `fully_shard` passes `policy=None`. Hybrid auto/manual wrapping is not in scope for this PR since it is not supported for wrapper FSDP either; a follow-up can either add support properly or raise an error early.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90874
Approved by: https://github.com/mrshenli