26d2f4ac - Quick fix to make torch.tensor work with functorch (#62423)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62423

Fixes https://github.com/facebookresearch/functorch/issues/7.

functorch uses FuncTorchDynamicLayerBackMode as a mode key to wrap all tensors returned from operators in a special TensorWrapper tensor extension. The problem is that TensorWrapper has no storage, so accessing its data_ptr (for recursive_store) triggers an internal assert.

As a quick hack, the added guard prevents functorch from wrapping the empty tensor in a TensorWrapper; instead, the tensor gets wrapped later, when `tensor.to` is called. This is effectively what Ed proposed in https://github.com/facebookresearch/functorch/issues/7#issuecomment-847501020.

In the long term we probably want a better way of extending `internal_new_from_data` for cases like this (where there is a mode-based dispatch key for a C++ tensor extension -- the Python case may be different).

Test Plan:
- Verified that this fixes functorch's problem

Reviewed By: malfet

Differential Revision: D29992607

Pulled By: zou3519

fbshipit-source-id: 82b713156a37d7470f8fc46e3803ee7353689a33
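For context, a minimal sketch of the call pattern this commit addresses: constructing a tensor with `torch.tensor` inside a functorch-transformed function. The function `f` and the choice of `vmap` are illustrative assumptions (the linked issue concerns `torch.tensor` under functorch transforms generally), and this is not the commit's own test, just a hypothetical reproducer using the old out-of-tree `functorch` package.

```python
# Hypothetical reproducer (a sketch, not the commit's test plan).
# torch.tensor called inside a functorch transform goes through
# internal_new_from_data, which builds an intermediate empty tensor and
# writes into it via recursive_store / data_ptr. Before this fix, the
# FuncTorchDynamicLayerBackMode key could wrap that empty tensor in a
# storage-less TensorWrapper, so the data_ptr access internally asserted.
import torch
from functorch import vmap  # old out-of-tree functorch API; assumed installed

def f(x):
    # Tensor construction inside the transformed function is the trigger.
    y = torch.tensor([1.0, 2.0])
    return x + y.sum()

# With the guard in place, the intermediate tensor stays unwrapped until
# `tensor.to` runs, so this succeeds instead of hitting the internal assert.
print(vmap(f)(torch.randn(3)))
```

On newer PyTorch releases, where functorch was merged into core, the equivalent entry point is `torch.func.vmap`.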