pytorch
a3a21504 - Codegen python bindings to access attributes of grad_fn (#52451)

Summary: Fixes https://github.com/pytorch/pytorch/issues/9922

Adds python bindings for *selected* fields that grad_fn saves; we did not add bindings for certain types such as `TypeAndSize` and `TensorGeometry`. All field names are prefixed with `_saved_` so they are easy to discern. User code should not depend on particular saved fields existing: what grad_fn saves for the backward pass is considered an implementation detail and is therefore prone to change.

Warning: not every parameter that is passed in is necessarily stored for the backward pass, and what you put in is not necessarily what you get out. Here we pass `kernel_size=3`, but `b.grad_fn._saved_kernel_size` returns `(3, 3)` instead of `3`; this varies case by case. For example:

```
import torch
import torch.nn as nn

model = nn.Conv2d(in_channels=3, out_channels=32, kernel_size=3, stride=2, padding=1, dilation=1)
a = torch.ones(1, 3, 32, 32, requires_grad=True)
b = model(a)

print("kernel_size: ", b.grad_fn._saved_kernel_size)  # returns tuple: (3, 3)
print("stride: ", b.grad_fn._saved_stride)
# print("dilation: ", b.grad_fn._saved_dilation)  # dilation is not stored for the backward pass
print("padding: ", b.grad_fn._saved_padding)
print("weight: ", b.grad_fn._saved_weight)
```

Sample of generated code:

```
PyObject* THPThnnConv2DBackward_self_getter(THPCppFunction *self, void *_unused) {
  const auto& prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->self_;
  return THPVariable_Wrap(prop.unpack());
}

PyObject* THPThnnConv2DBackward_weight_getter(THPCppFunction *self, void *_unused) {
  const auto& prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->weight_;
  return THPVariable_Wrap(prop.unpack());
}

PyObject* THPThnnConv2DBackward_kernel_size_getter(THPCppFunction *self, void *_unused) {
  auto prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->kernel_size;
  PyObject* tup = PyTuple_New((Py_ssize_t) prop.size());
  for (int i = 0; i < prop.size(); i++) {
    PyTuple_SetItem(tup, (Py_ssize_t) i, PyLong_FromUnsignedLong((uint64_t) prop[i]));
  }
  return tup;
}

PyObject* THPThnnConv2DBackward_stride_getter(THPCppFunction *self, void *_unused) {
  auto prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->stride;
  PyObject* tup = PyTuple_New((Py_ssize_t) prop.size());
  for (int i = 0; i < prop.size(); i++) {
    PyTuple_SetItem(tup, (Py_ssize_t) i, PyLong_FromUnsignedLong((uint64_t) prop[i]));
  }
  return tup;
}

PyObject* THPThnnConv2DBackward_padding_getter(THPCppFunction *self, void *_unused) {
  auto prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->padding;
  PyObject* tup = PyTuple_New((Py_ssize_t) prop.size());
  for (int i = 0; i < prop.size(); i++) {
    PyTuple_SetItem(tup, (Py_ssize_t) i, PyLong_FromUnsignedLong((uint64_t) prop[i]));
  }
  return tup;
}

PyObject* THPThnnConv2DBackward_finput_getter(THPCppFunction *self, void *_unused) {
  const auto& prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->finput_;
  return THPVariable_Wrap(prop.unpack());
}

PyObject* THPThnnConv2DBackward_fgrad_input_getter(THPCppFunction *self, void *_unused) {
  const auto& prop = static_cast<ThnnConv2DBackward*>(self->cdata.get())->fgrad_input_;
  return THPVariable_Wrap(prop.unpack());
}

static struct PyGetSetDef ThnnConv2DBackward_properties[] = {
  THP_FUNCTION_DEFAULT_PROPERTIES,
  {(char*)"_saved_self", (getter)THPThnnConv2DBackward_self_getter, nullptr, nullptr, nullptr},
  {(char*)"_saved_weight", (getter)THPThnnConv2DBackward_weight_getter, nullptr, nullptr, nullptr},
  {(char*)"_saved_kernel_size", (getter)THPThnnConv2DBackward_kernel_size_getter, nullptr, nullptr, nullptr},
  {(char*)"_saved_stride", (getter)THPThnnConv2DBackward_stride_getter, nullptr, nullptr, nullptr},
  {(char*)"_saved_padding", (getter)THPThnnConv2DBackward_padding_getter, nullptr, nullptr, nullptr},
  {(char*)"_saved_finput", (getter)THPThnnConv2DBackward_finput_getter, nullptr, nullptr, nullptr},
  {(char*)"_saved_fgrad_input", (getter)THPThnnConv2DBackward_fgrad_input_getter, nullptr, nullptr, nullptr},
  {nullptr} /* sentinel */
};

...

void initialize_autogenerated_functions() {
  ...
  static PyTypeObject ThnnConv2DBackwardClass;
  addClass<ThnnConv2DBackward>(ThnnConv2DBackwardClass, "ThnnConv2DBackward", ThnnConv2DBackward_properties);
  ...
}
```

Before:

```
void initialize_autogenerated_functions() {
  ...
  static PyTypeObject ThnnConv2DBackwardClass;
  addClass<ThnnConv2DBackward>(ThnnConv2DBackwardClass, "ThnnConv2DBackward");
  ...
}
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/52451
Reviewed By: H-Huang
Differential Revision: D26692633
Pulled By: soulitzer
fbshipit-source-id: a09b5b8138e4641093aff68c7e9dffdbb96911b8
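To make the generated C++ pattern above easier to follow, here is a minimal pure-Python sketch of what the codegen'd getters amount to. All names here (`SavedVariableSketch`, `ThnnConv2DBackwardSketch`) are hypothetical stand-ins, not real PyTorch classes: each saved field becomes a read-only `_saved_*` property, tensor fields go through an unpack step (analogous to `SavedVariable::unpack()` followed by `THPVariable_Wrap`), and integer-list fields are surfaced as tuples (analogous to the `PyTuple_New` loop).

```python
class SavedVariableSketch:
    """Hypothetical stand-in for autograd's SavedVariable: holds a
    saved value and releases it through an explicit unpack() call."""

    def __init__(self, value):
        self._value = value

    def unpack(self):
        return self._value


class ThnnConv2DBackwardSketch:
    """Hypothetical analogue of the generated ThnnConv2DBackward bindings:
    read-only _saved_* properties over the node's saved fields."""

    def __init__(self, weight, kernel_size):
        self.weight_ = SavedVariableSketch(weight)
        # The C++ getter rebuilds a PyTuple from an IntArrayRef; here we
        # just store the expanded per-dimension tuple directly.
        self.kernel_size = tuple(kernel_size)

    @property
    def _saved_weight(self):
        # Mirrors THPThnnConv2DBackward_weight_getter: unpack, then hand
        # the result back to Python.
        return self.weight_.unpack()

    @property
    def _saved_kernel_size(self):
        # Mirrors the IntArrayRef -> PyTuple conversion loop.
        return self.kernel_size


node = ThnnConv2DBackwardSketch(weight="<weight tensor>", kernel_size=[3, 3])
print(node._saved_weight)       # <weight tensor>
print(node._saved_kernel_size)  # (3, 3)
```

Note that, as in the generated bindings, the properties expose no setter, so user code can inspect but not mutate what the node saved.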