[pruner] fix activation handles logic (#61592)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/61592
Store an activation hook handle for each layer in a list, so that each one can be removed individually.
We don't remove them during `convert` in eager mode, since we aren't modifying input/output layer dimensions, but we will need this in FX mode.
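A minimal sketch of the idea, using only the public `register_forward_pre_hook` API; the model and hook body here are placeholders, not the pruner's actual internals:

```python
import torch.nn as nn

# Hypothetical model standing in for the layers the pruner targets.
model = nn.Sequential(nn.Linear(16, 16), nn.Linear(16, 4))

def activation_hook(module, inputs):
    # Placeholder for the pruner's activation bookkeeping;
    # returning None leaves the input unchanged.
    return None

# One handle per layer: register_forward_pre_hook returns a
# RemovableHandle, so keeping them in a list lets each hook be
# detached on its own.
activation_handles = [
    layer.register_forward_pre_hook(activation_hook) for layer in model
]

# Later (e.g. in an FX-mode convert), every hook can be removed:
for handle in activation_handles:
    handle.remove()
```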
ghstack-source-id: 133497376
Test Plan:
Added some tests to make sure `model(x)` runs without error (a rough sketch of that check follows the command below).
`buck test mode/dev-nosan //caffe2/test:ao -- TestBasePruner`
https://pxl.cl/1LBf4
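As an illustration of what such a smoke test verifies; the model and no-op hook are hypothetical stand-ins for the pruner's setup, not the actual test code:

```python
import torch
import torch.nn as nn

# Hypothetical model with hooks attached, mimicking a prepared pruner.
model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 4))
handles = [
    m.register_forward_pre_hook(lambda mod, inp: None)
    for m in model
    if isinstance(m, nn.Linear)
]

# The forward pass should still succeed with the hooks in place.
x = torch.randn(2, 16)
y = model(x)  # should complete without raising
assert y.shape == (2, 4)
```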
Reviewed By: z-a-f
Differential Revision: D29682789
fbshipit-source-id: 9185702736e5f7f4320754ffef441610738ac154