04c50fec - [FSDP Optim State] Remove checkpoint prefix (#80480)

Remove `_checkpoint_wrapped_module` prefixes when creating keys for the optimizer state_dict. These prefixes do not actually cause problems for optim_state_dict save / load, but we want to strip them for downstream code that consumes these APIs and typically expects checkpointing prefixes to be absent (checkpointing should be a transparent operation that does not change module or parameter names).

Pull Request resolved: https://github.com/pytorch/pytorch/pull/80480
Approved by: https://github.com/awgu, https://github.com/fegin
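
The idea is to rewrite the fully qualified names (FQNs) used as keys so the checkpoint wrapper becomes invisible to consumers. Below is a minimal sketch of that kind of prefix stripping, not the PR's actual code: the helper names (`strip_checkpoint_prefix`, `_clean_fqn`) are hypothetical, and it assumes an optimizer state_dict already keyed by FQN strings (as produced by FSDP's full optimizer state_dict APIs).

```python
from typing import Any, Dict

# Prefix inserted into module/parameter names by activation-checkpoint
# wrapping; the commit strips it from optimizer state_dict keys.
_CHECKPOINT_PREFIX = "_checkpoint_wrapped_module."


def _clean_fqn(fqn: str) -> str:
    # Hypothetical helper: remove every occurrence of the checkpoint
    # prefix from a parameter FQN (it can appear at any nesting level).
    return fqn.replace(_CHECKPOINT_PREFIX, "")


def strip_checkpoint_prefix(osd: Dict[str, Any]) -> Dict[str, Any]:
    # Hypothetical helper: return an optimizer state_dict whose "state"
    # keys and param_groups "params" entries use prefix-free FQNs.
    return {
        "state": {_clean_fqn(k): v for k, v in osd.get("state", {}).items()},
        "param_groups": [
            {**group, "params": [_clean_fqn(p) for p in group.get("params", [])]}
            for group in osd.get("param_groups", [])
        ],
    }


# Example: "layer1._checkpoint_wrapped_module.weight" -> "layer1.weight"
print(_clean_fqn("layer1._checkpoint_wrapped_module.weight"))
```

With this, a key like `layer1._checkpoint_wrapped_module.weight` becomes `layer1.weight`, so downstream checkpointing code sees the same parameter names whether or not activation checkpointing was applied.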