pytorch
f558e86f - [FSDP] continue if param not exist in sharded load (#109116)

[FSDP] continue if param not exist in sharded load (#109116)

If a parameter is added to a model before it is wrapped with FSDP, and a state dict is then loaded with strict=False, do not hard-error when the sharded load encounters a parameter that does not exist in the checkpoint.

Differential Revision: [D49170812](https://our.internmc.facebook.com/intern/diff/D49170812/)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/109116
Approved by: https://github.com/fegin
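The scenario the commit describes is loading a sharded checkpoint into a model that has gained a parameter the checkpoint does not contain. Below is a minimal sketch of that scenario, not code from the commit or its tests: the `Base`/`Extended` modules, the single-rank gloo process group, and the port are all illustrative assumptions, and the exact key names reported as missing may vary with the PyTorch version and FSDP wrapping.

```python
# Hypothetical repro sketch: load a sharded state dict with strict=False into
# an FSDP model that has an extra parameter not present in the checkpoint.
# Assumes a CPU/gloo single-rank setup works in your PyTorch build.
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp import StateDictType


class Base(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)


class Extended(Base):
    def __init__(self):
        super().__init__()
        # New parameter added after the checkpoint was taken; it is not in
        # the saved state dict.
        self.extra = nn.Parameter(torch.zeros(4))


def main():
    # Single-rank process group for illustration only (port is arbitrary).
    dist.init_process_group(
        "gloo", init_method="tcp://127.0.0.1:29500", rank=0, world_size=1
    )

    # Save a sharded state dict for the original model.
    model = FSDP(Base())
    with FSDP.state_dict_type(model, StateDictType.SHARDED_STATE_DICT):
        checkpoint = model.state_dict()

    # Wrap the extended model and load the old checkpoint non-strictly.
    # Before this commit, the sharded load could hard-error on the missing
    # "extra" parameter even with strict=False; after it, the load skips the
    # missing entry and reports it via the returned missing_keys.
    new_model = FSDP(Extended())
    with FSDP.state_dict_type(new_model, StateDictType.SHARDED_STATE_DICT):
        result = new_model.load_state_dict(checkpoint, strict=False)
    print(result.missing_keys)

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```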