pytorch
d0488932 - Add Context Manager for Disabling Multithreading in Backwards, use in aot autograd (#86245)

Committed 3 years ago
We were running into a few issues with multithreaded backwards in aot_autograd, such as https://github.com/pytorch/pytorch/issues/86136, and `FakeTensorMode` getting into a weird state because functions were not executed strictly sequentially. The multithreading is lost anyway when we trace out the backwards, and it adds a lot of additional complexity.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/86245
Approved by: https://github.com/albanD, https://github.com/yf225