pytorch
2bcbea1f - [Expanded Weights] fix layer norm (#80895)

Committed 3 years ago
[Expanded Weights] fix layer norm (#80895)

Opacus found that Layer Norm can fail due to a wrong ordering in the ExpandedWeights code. All of our existing tests used inputs that require grad, so a Layer Norm check always short-circuited in the tests and the wrong ordering was never exercised. This adds a test where the input does not require gradients and fixes the ordering issue in Layer Norm.

Closes #80952
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80895
Approved by: https://github.com/zou3519
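The scenario the new test covers can be sketched as follows. This is a minimal, hypothetical illustration, not the actual test added in #80895: the input deliberately does not require gradients, so nothing in the backward path can rely on `input.requires_grad` being true, yet gradients must still reach the affine parameters of `LayerNorm`.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
ln = nn.LayerNorm(4)

# Input with requires_grad=False (the default) -- the case the old
# tests never exercised, which hid the ordering bug.
x = torch.randn(3, 4)
assert not x.requires_grad

out = ln(x)
out.sum().backward()

# Gradients must still flow to LayerNorm's weight and bias.
print(ln.weight.grad is not None, ln.bias.grad is not None)  # True True
```

The key point is that correctness of the parameter gradients must not depend on the input participating in autograd; the fixed ordering in the ExpandedWeights code path restores that invariant.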
Author
samdow