pytorch
400293fc - Support remote for builtin operators in distributed autograd (#28630)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28630

This includes:
1. Respect the autograd context in rpc.remote for builtin ops.
2. Force setting the autograd context in RRef.to_here(), even if the message for to_here() does not contain any tensors.

Test Plan: Imported from OSS

Differential Revision: D18138562

Pulled By: mrshenli

fbshipit-source-id: a39ec83e556d19130f22eb317927241a017000ba
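For context, below is a minimal sketch (not taken from the commit) of the flow this change enables: a builtin op such as torch.add is launched on a remote worker via rpc.remote inside a distributed autograd context, and RRef.to_here() then fetches the result while attaching the autograd context so the distributed backward pass can propagate gradients. Worker names, tensor shapes, and the two-worker setup are illustrative, and the API calls follow the current torch.distributed.rpc / torch.distributed.autograd documentation, which may differ slightly from the version at the time of this commit.

```python
import torch
import torch.distributed.rpc as rpc
import torch.distributed.autograd as dist_autograd


def run_worker0():
    # Assumes MASTER_ADDR/MASTER_PORT are set and a "worker1" process
    # has also called rpc.init_rpc with rank=1.
    rpc.init_rpc("worker0", rank=0, world_size=2)

    with dist_autograd.context() as context_id:
        t1 = torch.ones(2, 2, requires_grad=True)
        t2 = torch.ones(2, 2, requires_grad=True)

        # Builtin op executed remotely; returns an RRef to the result.
        rref = rpc.remote("worker1", torch.add, args=(t1, t2))

        # to_here() fetches the value; with this change it participates
        # in the distributed autograd graph even for builtin ops.
        loss = rref.to_here().sum()

        # Distributed backward propagates gradients across workers.
        dist_autograd.backward(context_id, [loss])
        grads = dist_autograd.get_gradients(context_id)

    rpc.shutdown()
```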