pytorch
6fb84239 - [FSDP] Slightly refactor fx symbolic tracer (#89917)

Commit
3 years ago
[FSDP] Slightly refactor fx symbolic tracer (#89917)

I made a pass over Linjian's `_symbolic_trace.py` and tidied it up a bit. Aside from simple stylistic changes, this PR makes the following changes:

- Save `visited_params: Set[nn.Parameter]` to avoid linear overhead when checking whether a parameter has already been visited before appending to the parameter execution order list (`param_forward_order`)
- Move the tracer patching logic into a class `_ExecOrderTracer` so it holds a reference to `self.exec_info`, avoiding the fragmented 2-step initialization of the old approach (`_init_execution_info(root_module)` followed by `_patch_tracer(tracer, root_module, execution_info)`)
- Define `_ParamUsageInfo` to formalize the `Tuple[nn.Module, List[str, nn.Parameter]]` elements being mapped to in the execution info `dict`, and clarify the documentation regarding what this represents
- Change the unit test to use `TestCase`, not `FSDPTest`, to avoid initializing a process group

Pull Request resolved: https://github.com/pytorch/pytorch/pull/89917
Approved by: https://github.com/zhaojuanmao, https://github.com/fegin
Author
Andrew Gu