Remove wrap_dim from codegen layer. (#32738)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/32738
This simplifies the codegen layer, with the goal of making it simple enough to just check in.
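
For context, "wrap_dim" refers to normalizing negative dimension indices (e.g. `dim = -1` for the last dimension) into the valid `[0, ndim)` range before a kernel uses them; PyTorch exposes this as `maybe_wrap_dim` (see `ATen/WrapDimUtils.h`). The sketch below is illustrative only and is not the codegen output or the actual library implementation:

```cpp
#include <cstdint>
#include <stdexcept>

// Illustrative stand-in for dimension wrapping, not PyTorch's real helper.
// A negative dim counts from the end: -1 refers to the last dimension.
int64_t wrap_dim(int64_t dim, int64_t ndim) {
  const int64_t min_dim = -ndim;
  const int64_t max_dim = ndim - 1;
  if (dim < min_dim || dim > max_dim) {
    throw std::out_of_range("Dimension out of range");
  }
  return dim < 0 ? dim + ndim : dim;
}

// e.g. wrap_dim(-1, 4) == 3; wrap_dim(2, 4) == 2
```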
Test Plan: Imported from OSS
Differential Revision: D19610927
Pulled By: gchanan
fbshipit-source-id: 760734f579b1f655775e6d270918c361985f3743