5ace9122 - fix: do not reshard parameters twice (#110948)

This PR fixes potential double resharding of parameters that both:

1. require no gradient, and
2. were used more than once during the forward pass.

[`_register_post_backward_hook`](https://github.com/pytorch/pytorch/blob/main/torch/distributed/fsdp/_runtime_utils.py#L1415) already handles this case correctly; this PR does the same for `_register_post_backward_reshard_only_hook`.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/110948
Approved by: https://github.com/awgu
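For context, the fix amounts to guarding hook registration so that a parameter appearing multiple times in the forward pass only ever gets one reshard-only hook, and therefore is resharded once. Below is a minimal, self-contained sketch of that pattern; the `_HandleSketch` class and its attribute names are hypothetical stand-ins for illustration, not the actual FSDP internals from `torch/distributed/fsdp/_runtime_utils.py`.

```python
# A minimal sketch of the double-reshard guard, not the actual FSDP code.
# The handle name and attributes below are hypothetical stand-ins.

class _HandleSketch:
    """Stands in for an FSDP flat-parameter handle."""

    def __init__(self) -> None:
        # Tracks whether a post-backward reshard-only hook has already
        # been registered for this handle's flat parameter.
        self._post_backward_hook_registered = False

    def reshard(self) -> None:
        # In real FSDP, this frees the unsharded flat parameter and
        # keeps only this rank's shard.
        print("resharding parameters")


def _register_post_backward_reshard_only_hook_sketch(handle: _HandleSketch) -> None:
    # Without this guard, a no-grad parameter used twice in the forward
    # pass would get two hooks and be resharded twice after backward.
    if handle._post_backward_hook_registered:
        return
    handle._post_backward_hook_registered = True
    # ... here the real code registers the autograd hook that will
    # eventually call handle.reshard() ...


# Usage: a parameter reused in forward triggers registration twice,
# but the guard ensures only one hook (hence one reshard) results.
h = _HandleSketch()
_register_post_backward_reshard_only_hook_sketch(h)  # registers the hook
_register_post_backward_reshard_only_hook_sketch(h)  # no-op: already registered
```

Per the commit message, `_register_post_backward_hook` already applied this kind of idempotent registration; the PR extends the same treatment to the reshard-only path used for parameters that require no gradient.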