808d8be4 - Support calling __torch_function__ attribute access (#111737)

Summary: Triggers `__torch_function__` tracing on attribute/method/property access, matching the eager behavior for non-overridden attributes/methods/properties that are present on `torch.Tensor`.

Some caveats:

1. For methods, there doesn't seem to be a way to check whether the original implementation of a method has been overridden via monkey patching. For example:

```
class LocalSubclass(torch.Tensor):
    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        if kwargs is None:
            kwargs = {}
        return super().__torch_function__(func, types, args, kwargs)

x = torch.ones(2, 2).as_subclass(LocalSubclass)

>>> x.sigmoid
<built-in method sigmoid of LocalSubclass object at 0x7f8d305bb5e0>
```

There isn't a way to verify that this built-in method is equivalent to the base `torch.Tensor` implementation, because each instance has a different built-in method object that can't be traced back to the original `torch.Tensor` impl. You can check that the class itself has the original implementation via

```
>>> inspect.getattr_static(LocalSubclass, "sigmoid")
<method 'sigmoid' of 'torch._C.TensorBase' objects>
```

but we can't detect whether the user dynamically patches an object with a built-in method called `sigmoid` that does something completely different.

2. If a user overrides a method but calls the original implementation, we will still graph break. Fixing this will require modifying `SuperVariable` (and any other way to get the original impl) to handle tensor subclasses.

X-link: https://github.com/pytorch/pytorch/pull/111737
Approved by: https://github.com/jansel, https://github.com/ezyang
Reviewed By: izaitsevfb
Differential Revision: D50778866
Pulled By: mlazos
fbshipit-source-id: e87006fd04b4818b171c18cf623fd1f0fd358196
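The class-level check described in caveat 1 can be sketched without torch. The `Base`, `Unmodified`, and `Patched` classes below are hypothetical stand-ins for `torch.Tensor` and user subclasses; the point is that `inspect.getattr_static` can detect a class-level override but not per-instance monkey patching:

```python
import inspect

# Hypothetical stand-ins (not torch classes) used to illustrate the
# static-attribute check from the commit message.
class Base:
    def sigmoid(self):
        return "base impl"

class Unmodified(Base):
    pass

class Patched(Base):
    def sigmoid(self):  # class-level override
        return "patched impl"

def class_overrides(subclass, name, base=Base):
    """Return True if `subclass` overrides `name` at the class level.

    inspect.getattr_static fetches the raw class attribute without
    triggering descriptors, so comparing it to the base class attribute
    reveals class-level overrides. It cannot see per-instance monkey
    patching, which is the limitation the commit message notes.
    """
    return inspect.getattr_static(subclass, name) is not inspect.getattr_static(base, name)

print(class_overrides(Unmodified, "sigmoid"))  # False
print(class_overrides(Patched, "sigmoid"))     # True

# Per-instance patching is invisible to this class-level check:
obj = Unmodified()
obj.sigmoid = lambda: "instance patch"
print(class_overrides(Unmodified, "sigmoid"))  # still False
```

This mirrors why the tracer can trust `inspect.getattr_static` on the class but must be conservative about instance-level attribute assignment.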