566e06eb - Use _WeakTensorRef over weakref in test_autograd.py (#55726)

Summary:
There are a few autograd tests that check for tensors leaked by reference cycles. This changes them to use `_WeakTensorRef` instead of `weakref`. `_WeakTensorRef`, added in https://github.com/pytorch/pytorch/issues/52874, reads the C++-level `TensorImpl` reference count, whereas `weakref` tracks Python reference counts and so can only tell whether the Python wrapper object has been deallocated. Not only is this less code, it also more accurately detects that the Tensor itself was deallocated.

I didn't touch the `weakref` usage in [test_anomaly_assign_parent_cleanup](https://github.com/pytorch/pytorch/blob/fc349cbcde10a5d9e8c1fc9a47bbba025a8e9018/test/test_autograd.py#L3733) and [test_nested_anomaly_printstack_cleanup](https://github.com/pytorch/pytorch/blob/fc349cbcde10a5d9e8c1fc9a47bbba025a8e9018/test/test_autograd.py#L3772) because those tests intentionally check for Python object cleanup.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/55726

Reviewed By: ngimel

Differential Revision: D27718526

Pulled By: albanD

fbshipit-source-id: 37a4914360e35dd4ae8db06b29525cebec4d4b84
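For context, a minimal sketch contrasting the two approaches (not taken from the commit). `torch._C._WeakTensorRef` is a private API; the `expired()` method matches how the autograd tests use it, but the exact surface may vary between versions, so treat this as illustrative:

```python
import gc
import weakref

import torch

t = torch.ones(2)

# weakref only observes the Python wrapper object...
py_ref = weakref.ref(t)
# ...while _WeakTensorRef observes the C++ TensorImpl refcount.
impl_ref = torch._C._WeakTensorRef(t)

del t  # drop the last Python reference
gc.collect()

assert py_ref() is None    # Python wrapper was collected
assert impl_ref.expired()  # the underlying TensorImpl was freed as well
```

The distinction matters when something at the C++ level (for example, the autograd graph) still holds the tensor: in that case the Python wrapper can be collected while the `TensorImpl` lives on, which `weakref` alone would not detect.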