pytorch
da9af986 - [FSDP][4/N] Refactor func to share state/init handle attrs (#90871)

[FSDP][4/N] Refactor func to share state/init handle attrs (#90871)

For `limit_all_gathers`, if we do not enforce that all FSDP instances have the same value, then the semantics guaranteed by the `bool` can be violated entirely; it could be as if none of them set the value to `True`. For `use_orig_params`, the optimizer state dict assumes that the value is the same across all FSDP instances.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90871
Approved by: https://github.com/mrshenli
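The invariant described above can be sketched as a small validation helper. This is an illustrative stand-in, not the actual PyTorch internals: `FSDPState` and `validate_shared_attr` are hypothetical names, and the attribute flags merely mirror those named in the commit.

```python
from dataclasses import dataclass


# Hypothetical stand-in for an FSDP instance's init-time attributes;
# the flag names mirror the commit, but this class is illustrative only.
@dataclass
class FSDPState:
    limit_all_gathers: bool
    use_orig_params: bool


def validate_shared_attr(states, attr):
    """Ensure every state holds the same value for `attr`, and return it.

    If instances disagreed, the guarantee the bool provides (e.g. rate-
    limiting all-gathers) could be silently lost, so fail loudly instead.
    """
    values = {getattr(s, attr) for s in states}
    if len(values) != 1:
        raise ValueError(
            f"{attr!r} must be identical across FSDP instances, got {values}"
        )
    return values.pop()


states = [FSDPState(True, False), FSDPState(True, False)]
print(validate_shared_attr(states, "limit_all_gathers"))  # True
```

A mismatched instance (say, one state with `limit_all_gathers=False`) would raise `ValueError` rather than proceed with inconsistent semantics.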