03529ed1 - Remove hacky double registration of to_here op in reg_distributed_ops (#39602)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/39602

This was added as part of https://github.com/pytorch/pytorch/pull/38590, but we can use default arguments here instead. We use fmt::format to bind the default value for the RPC timeout at runtime.

ghstack-source-id: 105983645

Test Plan: CI

Differential Revision: D21912719

fbshipit-source-id: 7525c1322a95126f529301be142248af48565b82
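
The idea described in the commit is to register to_here once, with its default timeout filled into the operator schema string at runtime, instead of registering a second hand-written overload without the timeout argument. Below is a minimal sketch of that pattern, assuming the fmt library is available; the schema text and the constant kDefaultRpcTimeoutSeconds are illustrative assumptions, not the exact PyTorch source.

    #include <fmt/format.h>
    #include <string>

    // Illustrative sketch only: the default RPC timeout is bound into the
    // operator schema string at runtime via fmt::format, so no separate
    // registration without the timeout argument is needed.
    constexpr double kDefaultRpcTimeoutSeconds = 60;  // assumed value

    std::string toHereSchema() {
      // Produces e.g. "aten::to_here(RRef(t) self, float timeout=60) -> t(*)"
      return fmt::format(
          "aten::to_here(RRef(t) self, float timeout={}) -> t(*)",
          kDefaultRpcTimeoutSeconds);
    }

The resulting schema string would then be passed to the usual operator registration call, so to_here is registered exactly once.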