0b0e6551 - [FSDP] Fix param name prefixes for ignored modules (#79955)

[FSDP] Fix param name prefixes for ignored modules (#79955)

For ignored modules' parameters, we should also clean their parameter names, since they carry the FSDP-specific prefixes. This change only affects the prefixed parameter name keys in `full_optim_state_dict()` (i.e. optim state dict saving).

Not having this change does not actually violate the correctness of the optim state dict save-load flow, because that flow only requires the keys to be unique and internally consistent. Either way, this PR explicitly adds the specification that the parameter keys in the optim state dict should match the keys of the full model state dict.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/79955
Approved by: https://github.com/rohan-varma
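The prefix-cleaning described above can be sketched as a small string transformation. This is a minimal illustration, not the actual PyTorch implementation; the constant `FSDP_PREFIX` and the function `clean_param_name` below are hypothetical names, assuming the wrapper inserts a `_fsdp_wrapped_module.` segment into dotted parameter names (the exact prefix string used internally by FSDP may differ across versions):

```python
# Hypothetical sketch of cleaning FSDP wrapper prefixes from parameter names.
# FSDP_PREFIX is an assumed value; the real constant lives inside PyTorch's FSDP module.
FSDP_PREFIX = "_fsdp_wrapped_module."


def clean_param_name(name: str) -> str:
    """Strip every occurrence of the FSDP wrapper prefix from a dotted
    parameter name, so keys match those of the unwrapped model's state dict."""
    return name.replace(FSDP_PREFIX, "")


# A nested wrapped module produces one prefix segment per wrapping level;
# cleaning restores the original module-qualified name.
print(clean_param_name("_fsdp_wrapped_module.layer1._fsdp_wrapped_module.weight"))
```

With this cleaning applied to ignored modules' parameters as well, the keys in `full_optim_state_dict()` line up with the full model state dict keys, which is the invariant this PR makes explicit.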