19b6ee4d - model_dump working with delegate models (#61043)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/61043

Trying to make model_dump work with delegate models.

ghstack-source-id: 132809755

Test Plan: N509022. The data.pkl in the lowered model:
```
bash-3.2$ python -m torch.utils.show_pickle /Users/myuan/models/backend/lowered_model.pt@*/data.pkl
torch.jit.backend_with_compiler_demo.LoweredModule.__torch__.___torch_mangle_5.ModuleAdd()(state=
  (torch.jit._pickle.restore_type_tag(
      {'forward': torch.jit._pickle.restore_type_tag(
          {'input_shapes': '((1, 1, 320, 240), (1, 3))', 'some_other_option': 'True'},
          'Dict[str, str]')},
      'Dict[str, Any]'),
   torch.jit._pickle.restore_type_tag(
      {'forward': 'prim::Constant#1<debug_handle>271,aten::add<debug_handle>272'},
      'Dict[str, str]'),
   True))
```
Comparing to data.pkl in scripted_model.pt:
```
__torch__.___torch_mangle_7.ModuleAdd()(state=
  {'_is_full_backward_hook': None, 'training': True})
```

Reviewed By: Amyh11325

Differential Revision: D29464860

fbshipit-source-id: d738e98ea518339465f8e3375207cf83e3dac532
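For context, here is a minimal sketch of how a lowered ("delegate") model like the one in the test plan might be produced and then inspected. It is not part of this commit: the lowering call `torch._C._jit_to_backend`, the availability of the `backend_with_compiler_demo` backend, and the `--style` flag of `torch.utils.model_dump` are assumptions drawn from PyTorch's to_backend demo/test code; the compile-spec strings simply mirror the data.pkl output shown above.
```
# Sketch only: assumes the demo backend ("backend_with_compiler_demo") is
# compiled into your torch build and that torch._C._jit_to_backend accepts a
# scripted module plus a per-method compile spec, as in PyTorch's backend tests.
import torch


class ModuleAdd(torch.nn.Module):
    def forward(self, x, h):
        return x + h


scripted_module = torch.jit.script(ModuleAdd())

# Lower to the demo backend; this spec dict is what shows up (type-tagged)
# in the lowered model's data.pkl in the test plan above.
lowered_module = torch._C._jit_to_backend(
    "backend_with_compiler_demo",
    scripted_module,
    {"forward": {"input_shapes": "((1, 1, 320, 240), (1, 3))",
                 "some_other_option": "True"}},
)

torch.jit.save(lowered_module, "lowered_model.pt")

# Inspect the pickled state (same tool as the test plan), or render the full
# model_dump view that this change teaches to handle lowered modules:
#   python -m torch.utils.show_pickle "lowered_model.pt@*/data.pkl"
#   python -m torch.utils.model_dump --style=html lowered_model.pt > dump.html
```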