pytorch
f0e8e7cf - [DTensor] Supported `foreach=False` for `clip_grad_norm_` (#120238)

This PR adds `DTensor` support for `aten.linalg_vector_norm.default` and `aten.stack.default` so that we can run `clip_grad_norm_` (with `foreach=False`).

To implement `linalg_vector_norm`, we introduce a `_NormPartial` placement, since the reduction op for a norm is the norm itself.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/120238
Approved by: https://github.com/wanchaol
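The key property behind a norm-aware partial placement can be sketched without any distributed machinery: for a p-norm, reducing per-shard partial results with the norm itself (rather than a plain sum) reproduces the global norm. The snippet below is a minimal stdlib illustration of that identity; the shard layout and the `vector_norm` helper are hypothetical and stand in for the actual `_NormPartial` reduction in DTensor.

```python
import math

def vector_norm(xs, p=2.0):
    # p-norm of a flat list of floats
    return sum(abs(x) ** p for x in xs) ** (1.0 / p)

# A "global" gradient, sharded across two hypothetical ranks.
shard0 = [3.0, 4.0]
shard1 = [12.0, 0.0]
full = shard0 + shard1

# Each rank computes only its local (partial) norm ...
partial_norms = [vector_norm(shard0), vector_norm(shard1)]

# ... and the partials are reduced with the norm itself:
# ||x||_p == || (||x_0||_p, ||x_1||_p) ||_p
combined = vector_norm(partial_norms)

assert math.isclose(combined, vector_norm(full))  # both are 13.0
```

This is why a special placement is needed: summing the partial norms (as an ordinary `Partial` reduction would) gives 5 + 12 = 17, not the correct global norm of 13.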