cbf05305 - don't try to set training after ScriptModule has been initialized. (#23680)

don't try to set training after ScriptModule has been initialized. (#23680)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/23680

When a ScriptModule is initialized during the torch.jit.load() process, a cpp module already backs it, so setting `training` in __init__ would overwrite whatever training state the loaded ScriptModule already had. This PR splits out the common "set up internal state" part of the Module __init__ and calls it from both ScriptModule.__init__ and Module.__init__, leaving the nn.Module-specific part (setting `self.training`) to the nn.Module __init__.

Test Plan: Imported from OSS

Differential Revision: D16606959

Pulled By: suo

fbshipit-source-id: f7ea6b36551ff4e4472b7685f65731d5cfab87fd
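To illustrate the split the message describes, here is a minimal, self-contained Python sketch. The `_construct` helper, the `_CppModuleStub` class, and the attribute layout are hypothetical stand-ins for illustration only, not the actual torch internals.

    class Module:
        def __init__(self):
            self._construct()
            # nn.Module-specific part: plain Python modules own their flag.
            self.training = True

        def _construct(self):
            # Common "set up internal state" part, safe to call from both
            # Module.__init__ and ScriptModule.__init__.
            self._parameters = {}
            self._buffers = {}
            self._modules = {}


    class _CppModuleStub:
        # Hypothetical stand-in for the cpp module backing a loaded ScriptModule.
        def __init__(self, training):
            self.training = training


    class ScriptModule(Module):
        def __init__(self, cpp_module):
            # Only the shared setup runs here; assigning self.training would
            # clobber the state the backing cpp module was loaded with.
            self._construct()
            self._c = cpp_module

        @property
        def training(self):
            # Delegate to the backing cpp module, preserving the flag that
            # was restored at torch.jit.load() time.
            return self._c.training


    # The training flag carried by the (stubbed) cpp module survives __init__:
    m = ScriptModule(_CppModuleStub(training=False))
    assert m.training is False

Under these assumptions, the shared `_construct` gives both classes their common state, while only nn.Module's __init__ touches `self.training`, which matches the behavior the commit describes.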
Author: suo