8d84c5f1 - Fix static data initialization deadlock on GIL (#34505)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/34505

A thread could hold the GIL when calling `PythonRpcHandler::getInstance()` while another thread was performing static data initialization by calling `new PythonRpcHandler()`, which also requires the GIL. Static data initialization is thread-safe, so the thread holding the GIL waits for the other thread to finish initializing before proceeding. Because that initialization cannot make progress without the GIL, the two threads deadlock. The fix is to have the calling thread release the GIL before waiting, avoiding this situation.

ghstack-source-id: 99893858

Test Plan:
```
buck test mode/dev-nosan //caffe2/test/distributed/rpc:dist_autograd_spawn -- 'test_backward_simple_script_call \(test_dist_autograd_spawn\.DistAutogradTestWithSpawn\)' --stress-runs 100
```

Differential Revision: D7490489

fbshipit-source-id: 76f63cc7bedf088d3dbff288f53aa0bd33749255
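For context, here is a minimal C++ sketch of the deadlock and the release-GIL-first pattern the summary describes. It assumes pybind11's GIL helpers and that callers hold the GIL on entry; the class layout and member bodies are illustrative assumptions, not the actual code from this commit.

```cpp
// Sketch of the deadlock scenario and the fix (illustrative only; the
// bodies below are assumptions, not the actual PyTorch implementation).
#include <pybind11/pybind11.h>

namespace py = pybind11;

class PythonRpcHandler {
 public:
  static PythonRpcHandler& getInstance() {
    // Callers are assumed to hold the GIL. Before the fix, a thread holding
    // the GIL would block on the initialization guard of the static below
    // while another thread ran the constructor, which itself needs the
    // GIL -- a deadlock. Releasing the GIL first breaks the cycle; it is
    // reacquired automatically when `release` goes out of scope.
    py::gil_scoped_release release;
    // C++11 "magic static": initialization is thread-safe. The leaked
    // pointer mirrors the `new PythonRpcHandler()` mentioned in the summary.
    static PythonRpcHandler* handler = new PythonRpcHandler();
    return *handler;
  }

 private:
  PythonRpcHandler() {
    // The constructor needs the GIL, e.g. to set up Python-side state.
    py::gil_scoped_acquire acquire;
    // ... initialize Python-side state ...
  }
};
```

The key ordering is release-then-wait: the waiting thread gives up the GIL before blocking on the static initialization guard, so the initializing thread can acquire the GIL and finish, after which every waiter reacquires the GIL and returns the singleton.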