4350d4af - Immediately mark DLPack capsule as used after stealing the ownership (#56789)

Summary:
After stealing the ownership of the tensor passed via a DLPack capsule, PyTorch should immediately mark the capsule as used (by renaming it to `used_dltensor`). This fix is needed because the following line may raise an exception:

```cpp
py::module::import("torch.cuda").attr("init")();
```

When an exception is raised, the Tensor created by `at::fromDLPack` calls the `deleter`. However, since the capsule has not been consumed, the producer (the library that created the capsule) also calls the `deleter`, causing a double free.

Reproducer (running on an A100 GPU with a PyTorch wheel that does not include `sm_80` support; in this configuration `torch.cuda.init` raises a warning):

```py
$ python -Werror
>>> import torch.utils.dlpack
>>> import cupy
>>> tensor = torch.utils.dlpack.from_dlpack(cupy.arange(10).toDlpack())
free(): double free detected in tcache 2
zsh: abort (core dumped)  python -Werror
```

Once this fix is merged, users see the exception correctly:

```
A100-PCIE-40GB with CUDA capability sm_80 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_37 sm_50 sm_60 sm_70.
If you want to use the A100-PCIE-40GB GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/56789
Reviewed By: astaff
Differential Revision: D28118512
Pulled By: mruberry
fbshipit-source-id: 56992f7a3fc78d94c69513e864a473ae9587a9c8
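For illustration, here is a minimal sketch of the ordering this fix establishes in the DLPack import path: rename the capsule to `used_dltensor` right after `at::fromDLPack` steals ownership, before any call that may throw. The function name, error handling, and surrounding structure are simplified assumptions, not the exact `torch/csrc/Module.cpp` code.

```cpp
#include <Python.h>
#include <ATen/DLConvertor.h>                      // at::fromDLPack, DLManagedTensor
#include <pybind11/pybind11.h>
#include <torch/csrc/autograd/python_variable.h>   // THPVariable_Wrap

namespace py = pybind11;

// Sketch only: names and error handling are illustrative, not upstream code.
static PyObject* fromDLPack_sketch(PyObject* capsule) {
  auto* dlMTensor =
      (DLManagedTensor*)PyCapsule_GetPointer(capsule, "dltensor");
  if (!dlMTensor) {
    return nullptr;  // not a valid, unconsumed "dltensor" capsule
  }

  // at::fromDLPack steals ownership: the resulting Tensor invokes
  // dlMTensor->deleter when it is destroyed.
  at::Tensor tensor = at::fromDLPack(dlMTensor);

  // Rename the capsule *before* anything that may throw, so the producer
  // no longer runs its own deleter on the same DLManagedTensor.
  PyCapsule_SetName(capsule, "used_dltensor");

  // This call may raise (e.g. the sm_80 compatibility warning under
  // `python -Werror`); with the capsule already consumed, unwinding here
  // frees the tensor exactly once instead of twice.
  if (tensor.is_cuda()) {
    py::module::import("torch.cuda").attr("init")();
  }

  return THPVariable_Wrap(std::move(tensor));
}
```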
Files changed:
  • torch/csrc/Module.cpp