pytorch
b01d1ad1 - [FSDP] Fix summon_full_params when not sharded (#72572)

[FSDP] Fix summon_full_params when not sharded (#72572)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/72572

Use `continue` instead of `pass`, which would result in an AttributeError because `_full_param_padded` is not created for an unsharded parameter when world_size == 1. Add a test to cover this case.

ghstack-source-id: 149111044

Test Plan: CI

Reviewed By: zhaojuanmao

Differential Revision: D34101124

fbshipit-source-id: 71d82bf94a091ef90f52b31c213192a5dd547332
(cherry picked from commit cc7899a5eaf5bc091eb772ade68a0a24a1fdab80)
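The fix is a one-line control-flow change. Below is a minimal, self-contained sketch of the bug pattern, not the actual FSDP source: `FakeParam` and `gather_full_params` are hypothetical stand-ins, `_full_param_padded` is the attribute name cited in the commit message, and `_is_sharded` is an assumed flag for whether a parameter was sharded.

```python
# Hypothetical sketch of the `pass` vs. `continue` bug described above;
# not the real FSDP implementation.
import torch

class FakeParam:
    """Stand-in for an FSDP-managed parameter (hypothetical helper)."""
    def __init__(self, sharded: bool):
        self._is_sharded = sharded
        if sharded:
            # Only sharded parameters get the padded full-param buffer;
            # with world_size == 1 the parameter stays unsharded and
            # `_full_param_padded` is never created.
            self._full_param_padded = torch.zeros(4)

def gather_full_params(params):
    gathered = []
    for p in params:
        if not p._is_sharded:
            # With `pass` here, control would fall through to the access
            # below and raise AttributeError, since `_full_param_padded`
            # does not exist on unsharded parameters. `continue` skips
            # straight to the next parameter instead.
            continue
        gathered.append(p._full_param_padded)
    return gathered

# world_size == 1 case: the unsharded parameter is skipped cleanly.
print(gather_full_params([FakeParam(sharded=False), FakeParam(sharded=True)]))
```

With `pass`, the branch body is a no-op and execution continues into code that assumes the sharded-parameter invariant; `continue` restores the invariant by skipping the rest of the loop body, which is exactly the behavior the added test covers.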