c3d40fdf - [ATen] Use expect_contiguous in layer_norm (#58067)

Commit
3 years ago
[ATen] Use expect_contiguous in layer_norm (#58067)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/58067

- Use expect_contiguous in layer_norm to avoid unnecessary refcount bumps when the tensors are already contiguous
- Clean up some leftovers from the hacky-wrappers removal: use c10::MaybeOwned<Tensor> for the bias tensors
- Skip the dispatcher for at::empty in the layer_norm impl in Static Runtime

Test Plan: CI

Reviewed By: swolchok

Differential Revision: D28214298

fbshipit-source-id: 73150fa62d5c18f41a2264f8e56bbe5e377ad045
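The core idea behind expect_contiguous is that when the input tensor is already contiguous, the callee can borrow it instead of bumping its refcount, and only materialize an owned contiguous copy otherwise. Below is a minimal, self-contained sketch of that borrow-or-own pattern; `MaybeOwned`, `FakeTensor`, and `expect_contiguous` here are simplified stand-ins written for illustration, not the actual c10 implementations.

```cpp
#include <cassert>
#include <utility>

// Sketch of the c10::MaybeOwned idea: hold either a borrowed pointer
// (no refcount bump) or an owned value.
template <typename T>
class MaybeOwned {
  const T* borrowed_ = nullptr; // set when merely borrowing
  T owned_{};                   // used when we must own a copy
  bool is_owned_ = false;

 public:
  static MaybeOwned borrow(const T& t) {
    MaybeOwned m;
    m.borrowed_ = &t;
    return m;
  }
  static MaybeOwned own(T t) {
    MaybeOwned m;
    m.owned_ = std::move(t);
    m.is_owned_ = true;
    return m;
  }
  const T& operator*() const { return is_owned_ ? owned_ : *borrowed_; }
};

// Hypothetical stand-in for a tensor with a contiguity flag.
struct FakeTensor {
  bool contiguous = true;
  int data = 0;
};

// Mirrors the expect_contiguous pattern: borrow when already contiguous,
// otherwise produce an owned contiguous copy.
MaybeOwned<FakeTensor> expect_contiguous(const FakeTensor& t) {
  if (t.contiguous) {
    return MaybeOwned<FakeTensor>::borrow(t);
  }
  FakeTensor copy = t;       // in ATen this would be t.contiguous()
  copy.contiguous = true;
  return MaybeOwned<FakeTensor>::own(copy);
}
```

In the common case (contiguous inputs to layer_norm), the borrow branch avoids any copy or ownership transfer, which is exactly the refcount saving the commit describes.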
Author
Hao Lu