be682bef - [FSDP] Add `use_orig_params` (#84911)

**Overview**

This PR adds the option to use the original parameters via `use_orig_params=True` in the FSDP constructor.

- This exposes the original parameters rather than the `FlatParameter`s from `named_parameters()`, which means that the optimizer runs on the original parameters. Hence, users may assign original parameters from the same `FlatParameter` to different parameter groups (see the sketch below).
- This enables decoupling the original parameter variables from their storage without changing the variables themselves, which is critical for our upcoming execution-order-based non-recursive wrapping policy.

For a more detailed design explanation, refer to the Quip shared internally.

**Follow-Ups**

See 85831 (removing the link to avoid spamming the issue whenever I update this PR).

`test_fsdp_use_orig_params.py` adds ~4 min 46 seconds to the TTS on the AWS cluster.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/84911
Approved by: https://github.com/rohan-varma
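A minimal sketch of how `use_orig_params=True` might be used (not part of the PR itself): because `named_parameters()` yields the original parameters instead of `FlatParameter`s, parameters sharing a `FlatParameter` can be split across optimizer parameter groups. The model, hyperparameters, and the weight-decay grouping below are illustrative, and a process group is assumed to be initialized already (e.g. via `torchrun`).

```python
import torch
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Illustrative model; assumes torch.distributed is already initialized.
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 8))
fsdp_model = FSDP(model, use_orig_params=True)

# With use_orig_params=True, named_parameters() exposes the original
# parameters, so decay/no-decay groups can be formed even when the
# parameters live in the same underlying FlatParameter.
decay, no_decay = [], []
for name, param in fsdp_model.named_parameters():
    (no_decay if name.endswith("bias") else decay).append(param)

optim = torch.optim.AdamW(
    [
        {"params": decay, "weight_decay": 0.01},
        {"params": no_decay, "weight_decay": 0.0},
    ],
    lr=1e-3,
)
```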