pytorch
2cbc0ede - [DDP] Log if graph is static at end of training (#61871)

[DDP] Log if graph is static at end of training (#61871)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/61871

When set_static_graph=False, the only type of dynamism we really support in DDP is a dynamic set of unused parameters, which must be explicitly enabled with find_unused_parameters=True. However, some workflows have a static set of unused parameters; it would be good to detect this and add it to logging, to identify workflows that are candidates for the static graph optimization.

ghstack-source-id: 134371429

Test Plan: CI

Reviewed By: zhaojuanmao

Differential Revision: D29773962

fbshipit-source-id: 1f741984c6e6f8e3e55cf69ca719b1e25a485b13
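The detection the commit describes amounts to checking whether the set of unused parameters is identical across iterations. A minimal sketch of that idea in plain Python (this is an illustration of the concept, not PyTorch's actual DDP implementation; the class and method names are hypothetical):

```python
# Hypothetical sketch: track the set of unused parameters seen on each
# training iteration. If the set never changes, the graph is effectively
# static, and the workflow is a candidate for DDP's static graph
# optimization.

class UnusedParamTracker:
    def __init__(self):
        self.prev_unused = None   # frozenset of names from the last iteration
        self.static_so_far = True

    def record_iteration(self, unused_param_names):
        """Record the unused-parameter set observed in one iteration."""
        current = frozenset(unused_param_names)
        if self.prev_unused is not None and current != self.prev_unused:
            self.static_so_far = False
        self.prev_unused = current

    def graph_is_static(self):
        """True if every recorded iteration had the same unused set."""
        return self.static_so_far


tracker = UnusedParamTracker()
for _ in range(3):
    # Same parameter unused every step: graph looks static.
    tracker.record_iteration({"encoder.bias"})
assert tracker.graph_is_static()

# A different unused set on a later step makes the graph dynamic.
tracker.record_iteration({"decoder.weight"})
assert not tracker.graph_is_static()
```

At the end of training, a flag like this could be logged to flag runs where find_unused_parameters=True was enabled but the unused set never actually varied.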