[Pytorch Mobile] optimize_for_mobile: Remove dropout from any function (#53846)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/53846
There's already a variant of removeDropout that takes in a graph, so just switch to calling that one. It doesn't error-check that the module isn't in training mode (because it doesn't have a module), but optimize_for_mobile guarantees the cloned_module is in eval mode.
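A minimal sketch of the behavior this change enables: dropout removal now applies to any scripted method passed through optimize_for_mobile, not just forward. The module, method names, and shapes below are illustrative, not taken from the PR's actual test.

```python
import torch
import torch.nn as nn
from torch.utils.mobile_optimizer import optimize_for_mobile

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)
        self.dropout = nn.Dropout(p=0.5)

    def forward(self, x):
        return self.dropout(self.fc(x))

    @torch.jit.export
    def foo(self, x):
        # A second method containing a dropout node.
        return self.dropout(x)

# optimize_for_mobile requires an eval-mode scripted module.
m = Net().eval()
scripted = torch.jit.script(m)

# Preserve foo so it survives optimization alongside forward.
opt = optimize_for_mobile(scripted, preserved_methods=["foo"])

# The dropout nodes should be gone from both graphs.
assert "aten::dropout" not in str(opt.graph)
assert "aten::dropout" not in str(opt.foo.graph)

# In eval mode dropout is the identity, so outputs are unchanged.
x = torch.randn(2, 4)
assert torch.allclose(opt(x), scripted(x))
```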
ghstack-source-id: 124544216
Test Plan: Called optimize on forward and foo; both contained dropouts, and both dropouts were removed. Called both functions afterwards to verify they ran and gave the same output. {P308987364}
Reviewed By: kimishpatel
Differential Revision: D26986251
fbshipit-source-id: 085e08cbaa982aa08803a718fee4380af5f86b78