78d22e56 - Support pop torch function mode stack (#133131)

Support pop torch function mode stack (#133131)

Summary:
This PR adds support for tracing `torch._C._pop_torch_function_stack()` without graph breaking. To verify the state change, it also adds replay of mutations to the torch function mode stack via side_effects, appending supplemental bytecode as is done for other mutable Python objects.

Details:
To represent the torch function mode stack symbolically, a deque field is added to the instruction translator. When the InstructionTranslator is initialized, all modes are read from the current torch function mode stack and stashed in a global weak ref for later access (using existing sources), without needing to push/pop the Python/C++ torch function mode stack. During tracing, when `_pop_torch_function_stack` is encountered, a value is popped from this deque and the variable tracker representing the mode is returned. To ensure the true torch function mode stack matches this state, `TorchFunctionModeStackVariable`, a singleton, is marked as mutated; this adds it to side effects, and during final codegen, side effects emits a call to a Python helper which updates the Python torch function mode stack.

X-link: https://github.com/pytorch/pytorch/pull/133131
Approved by: https://github.com/jansel
ghstack dependencies: #133130, #133729

Reviewed By: jeanschmidt

Differential Revision: D61541474

Pulled By: mlazos

fbshipit-source-id: f796356328bd4cf457bba82f3e9c86cd699ccd39
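
A minimal sketch of the behavior this enables, assuming the private bindings `torch._C._push_on_torch_function_stack` and `torch._C._len_torch_function_stack` (from the same family of C bindings as `_pop_torch_function_stack`) and the `eager` compile backend; this is illustrative, not part of the PR itself:

```python
import torch
from torch.overrides import TorchFunctionMode

class NoopMode(TorchFunctionMode):
    # A trivial torch function mode that just forwards every call.
    def __torch_function__(self, func, types, args=(), kwargs=None):
        return func(*args, **(kwargs or {}))

def fn(x):
    # With this PR, the pop is traced symbolically: Dynamo pops from its
    # internal deque instead of graph breaking, and replays the mutation on
    # the real stack through side_effects bytecode after the graph runs.
    torch._C._pop_torch_function_stack()
    return torch.add(x, 1.0)

# fullgraph=True asserts that no graph break occurs while tracing fn.
compiled = torch.compile(fn, backend="eager", fullgraph=True)

torch._C._push_on_torch_function_stack(NoopMode())
try:
    out = compiled(torch.ones(2))
    # Side-effect replay has applied the pop to the real stack, so it is
    # empty again after the compiled call returns.
    assert torch._C._len_torch_function_stack() == 0
finally:
    # Clean up in case the pop was not applied (e.g. on an older build).
    while torch._C._len_torch_function_stack() > 0:
        torch._C._pop_torch_function_stack()
```

The mode is pushed manually rather than via `with NoopMode():` so that the context manager's own exit does not race with the replayed pop; the point of the example is only that the pop neither graph breaks nor leaves the real stack out of sync with the traced state.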