pytorch
34b46359 - Fix forwarding/move bug (#53556)

Fix forwarding/move bug (#53556)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/53556

When packing a `Tensor&` (mutable lvalue reference) into an IValue, we accidentally didn't increase the refcount. This wasn't triggered anywhere until I tried to enable backend fallbacks. Backend fallbacks for ops that have out arguments (i.e. ops that take `Tensor&` arguments and return `Tensor&` arguments) pack those returns into an IValue stack (and accidentally don't increase the refcount); later, that stack gets destructed (which decreases the refcount and possibly destroys the Tensor), and the `Tensor&` passed in as an out argument suddenly points to freed memory. This PR fixes that by forwarding instead of moving when wrapping Tensors into IValues.

ghstack-source-id: 125886986

(Note: this ignores all push blocking failures!)

Test Plan: waitforsandcastle

Reviewed By: swolchok

Differential Revision: D26896507

fbshipit-source-id: 62102fa89e522699b5174c33279a2b1a775066a4