eb56b08f - [FSDP] Fix `clip_grad_norm_()` for low prec grads (#90028)

For PyTorch FSDP, the only way that gradients are in low precision is if `keep_low_precision_grads=True` or if the user turns on AMP. This PR adds tests for the former and improves the documentation for `clip_grad_norm_()`, especially around these non-full-precision cases.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90028
Approved by: https://github.com/rohan-varma
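Below is a minimal sketch (not code from the PR) of the configuration the commit message refers to: wrapping a model with FSDP, keeping sharded gradients in low precision via `MixedPrecision(keep_low_precision_grads=True)`, and then calling FSDP's own `clip_grad_norm_()` method. The single-GPU, single-process setup and the tiny model/optimizer are illustrative assumptions.

```python
import os
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp import MixedPrecision

# Assumption: a single-process "distributed" run on one CUDA device,
# just to make the sketch self-contained.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("nccl", rank=0, world_size=1)
torch.cuda.set_device(0)

model = torch.nn.Linear(8, 8).cuda()
fsdp_model = FSDP(
    model,
    mixed_precision=MixedPrecision(
        param_dtype=torch.float16,
        reduce_dtype=torch.float16,
        # Keep the sharded gradients in fp16 after the gradient reduction;
        # this is the low-precision-gradient case the PR adds tests for.
        keep_low_precision_grads=True,
    ),
)
optim = torch.optim.SGD(fsdp_model.parameters(), lr=1e-2)

loss = fsdp_model(torch.randn(4, 8, device="cuda")).sum()
loss.backward()
# Use the FSDP method, not torch.nn.utils.clip_grad_norm_, so the total
# norm is computed correctly across all sharded gradients.
total_norm = fsdp_model.clip_grad_norm_(max_norm=1.0)
optim.step()
dist.destroy_process_group()
```

The other low-precision path the commit mentions is AMP, i.e. running the forward/backward under `torch.autocast` with a `GradScaler`; in that case gradients must be unscaled before clipping, which is part of what the improved `clip_grad_norm_()` documentation addresses.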
Author: Andrew Gu