pytorch
42423854 - add test to ensure that dist autograd contexts are cleaned up in case of nested rpcs (#28485)

add test to ensure that dist autograd contexts are cleaned up in case of nested rpcs (#28485)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/28485

This diff adds a test to ensure that, when we have multiple nested RPCs inside a dist autograd context, the context created as a result of a nested RPC is cleaned up after the node that created the context exits the context manager. For example, worker 0 might send an RPC to worker 1 that in turn results in an RPC to worker 2, so worker 2 will hold worker 0's context even though worker 0 never directly talked to worker 2. This test ensures that the context on worker 2 is also cleaned up.

ghstack-source-id: 92611018

Test Plan: Ran the unit test.

Differential Revision: D18079212

fbshipit-source-id: d49f0cda0bf2908747546e5c8a967256c848c685
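The propagation-and-cleanup behavior the test checks can be sketched as a toy model: a context id travels along every RPC hop (including nested hops to workers the originator never contacted directly), and exiting the context manager on the originating worker releases the context everywhere. This is a minimal illustration only, not the real torch.distributed.autograd implementation; the `Worker`, `rpc`, and `release_context` names here are hypothetical stand-ins.

```python
# Toy model of dist autograd context propagation through nested RPCs.
# Hypothetical sketch -- NOT the torch.distributed.autograd API.

class Worker:
    def __init__(self, name):
        self.name = name
        self.contexts = set()  # dist autograd contexts known on this worker

    def rpc(self, dest, context_id, nested=None):
        # Any RPC received under a context registers that context locally
        # on the destination worker.
        dest.contexts.add(context_id)
        if nested is not None:
            # A nested RPC from dest to a third worker carries the SAME
            # context id, so the third worker learns about it too.
            dest.rpc(nested, context_id)

    def release_context(self, context_id, peers):
        # Exiting the context manager on the originating worker tells
        # every worker to drop the context.
        self.contexts.discard(context_id)
        for p in peers:
            p.contexts.discard(context_id)

w0, w1, w2 = Worker("w0"), Worker("w1"), Worker("w2")
ctx = 42
w0.contexts.add(ctx)
w0.rpc(w1, ctx, nested=w2)  # w0 -> w1 -> w2: w2 now holds w0's context
assert ctx in w2.contexts   # even though w0 never talked to w2 directly
w0.release_context(ctx, [w1, w2])
assert all(ctx not in w.contexts for w in (w0, w1, w2))
print("context cleaned up on all workers")
```

In the real test, the cleanup is driven by the RPC framework when worker 0 exits `dist_autograd.context()`; the toy model just captures the invariant being asserted: after release, no worker in the RPC chain still holds the context.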