446d95a7 - [fx const fold] fix some cases with deep model hierarchy (#64945)

[fx const fold] fix some cases with deep model hierarchy (#64945)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/64945

In the const folding pass, we try to create `get_attr` nodes in submod_1 for `get_attr` nodes that are in the main graph, but submod_1 does not own the real attributes. To fix this, we assign the main module as the owning module of submod_1's graph.

The fix above would cause a problem for `call_module` nodes in submod_1, because during the split those modules get inlined into submod_1 (target changed from "mod.a.b" to "mod_a_b"). Changing the owning module would make those `call_module` nodes unable to find the module they refer to. To fix this, we set the target module to the main module.

Reviewed By: jfix71

Differential Revision: D30905949

fbshipit-source-id: cd67bc8fe4b8ad4344ae97b8e36753fdce3ece6d
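
Below is a minimal, hypothetical sketch (not taken from the commit or its tests) of the kind of model the fix targets: a constant computation living on a deeply nested submodule, run through `torch.fx.experimental.const_fold.split_const_subgraphs`. The extracted const subgraph must resolve `get_attr` targets like "sub.inner.const" against the main module, which is the case this commit repairs. The module names and structure here are illustrative assumptions.

```python
# Sketch, assuming torch.fx.experimental.const_fold.split_const_subgraphs
# behaves as described in the commit message.
import torch
import torch.nn as nn
from torch.fx.experimental import const_fold


class Inner(nn.Module):
    def __init__(self):
        super().__init__()
        # Constant attribute on a deeply nested submodule; from the root
        # module its get_attr target is "sub.inner.const".
        self.const = nn.Parameter(torch.randn(4), requires_grad=False)

    def forward(self, x):
        # (const + const) depends only on attributes, so it is foldable;
        # the add with x is not.
        return x + (self.const + self.const)


class Outer(nn.Module):
    def __init__(self):
        super().__init__()
        self.inner = Inner()

    def forward(self, x):
        return self.inner(x)


class Main(nn.Module):
    def __init__(self):
        super().__init__()
        self.sub = Outer()

    def forward(self, x):
        return self.sub(x)


mod = Main()
# Split the traced graph into a constant subgraph and the remaining graph.
folded = const_fold.split_const_subgraphs(mod)
folded.run_folding()

x = torch.randn(4)
# The folded module should still compute the same result as the original.
assert torch.allclose(folded(x), mod(x))
```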