e5a48da6 - Allow FSDP to have ignored modules out of wrapped root (#91079)

Allow FSDP to have ignored modules outside the wrapped root (#91079)

Motivations for this change:

1. TorchRec returns inconsistent results from `m.named_parameters()` and `m.m1.named_parameters()` when `m1` is a `ShardedModule`: `ShardedModule` appears in `m.named_modules()`, but its parameters are not in `m.named_parameters()`. As a result, when we identify `ShardedModule` instances and pass them as `ignored_modules` to FSDP, FSDP raises a key error in `_get_ignored_params`.
2. When users manually wrap submodules with FSDP, it is easier for them to keep one global set of ignored parameters instead of creating a new collection for every FSDP invocation.

Given these two reasons, FSDP now accepts ignored modules that lie outside the wrapped root module.

Differential Revision: [D42132394](https://our.internmc.facebook.com/intern/diff/D42132394)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/91079
Approved by: https://github.com/awgu
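
To illustrate the second motivation, here is a minimal sketch (hypothetical module names; assumes a process group has already been initialized, e.g. via `torchrun`): one global set of ignored modules is reused for every FSDP call, even when an ignored module lies outside the submodule being wrapped.

```python
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Hypothetical model for illustration only.
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.m1 = nn.Linear(8, 8)  # e.g. a sharded module we want FSDP to ignore
        self.m2 = nn.Linear(8, 8)

model = Model()

# One global set of ignored modules, reused for every FSDP invocation.
ignored = {model.m1}

# Manual wrapping: model.m2 is the wrapped root here, and model.m1 lies
# outside it. With this change, FSDP accepts the global set anyway
# instead of erroring in _get_ignored_params.
model.m2 = FSDP(model.m2, ignored_modules=ignored)

# Wrapping the actual root works as before; model.m1 is inside this root.
model = FSDP(model, ignored_modules=ignored)
```

Before this change, the inner `FSDP(model.m2, ignored_modules=ignored)` call would require the caller to filter `ignored` down to modules under `model.m2`, i.e. build a fresh collection per invocation.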