pytorch
23ff47cc - functionalization: fix detach() (#87750)

functionalization: fix detach() (#87750)

`.detach()` worked in basic cases previously, but it didn't properly preserve the view relationship between the base and the output. This wasn't heavily tested, because autograd doesn't normally encounter `FunctionalTensorWrapper` directly, but it could become more common if we fuse functionalization and autograd into a single tracing pass.

This is also a bug fix for LTC (and for XLA once it uses functionalization).

Pull Request resolved: https://github.com/pytorch/pytorch/pull/87750
Approved by: https://github.com/ezyang
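To illustrate the aliasing invariant at stake, here is a minimal, purely hypothetical Python sketch (it does not use the real `FunctionalTensorWrapper` API or PyTorch at all): `detach()` is a view op, so a functionalization wrapper produced by it must keep a link back to its base, and in-place mutations of the base must remain visible through the detached output. The `FakeTensor`, `FunctionalWrapper`, `detach`, and `add_` names below are all invented for this sketch.

```python
class FakeTensor:
    """Toy stand-in for a tensor: just holds a list of values (hypothetical)."""
    def __init__(self, values):
        self.values = list(values)


class FunctionalWrapper:
    """Toy sketch of a functionalization wrapper (NOT the real
    FunctionalTensorWrapper). A wrapper created by a view op records its
    base, and the base records its views, so in-place mutations can be
    propagated and aliasing semantics are preserved."""

    def __init__(self, tensor, base=None):
        self.tensor = tensor
        self.base = base      # None for a non-view (freshly created) wrapper
        self.views = []       # wrappers that alias this one
        if base is not None:
            base.views.append(self)

    def detach(self):
        # The bug analogue: returning a wrapper with base=None would drop
        # the view relationship. The fix analogue: detach() is a view op,
        # so the output must keep a link to its base.
        return FunctionalWrapper(FakeTensor(self.tensor.values), base=self)

    def add_(self, x):
        # In-place mutation: update this wrapper, then replay the result
        # onto every registered view so aliases stay in sync.
        self.tensor.values = [v + x for v in self.tensor.values]
        for view in self.views:
            view.tensor.values = list(self.tensor.values)
        return self


base = FunctionalWrapper(FakeTensor([0, 0, 0]))
out = base.detach()
base.add_(1)
print(out.tensor.values)  # mutation of the base is visible: [1, 1, 1]
```

Had `detach()` dropped the base link (the pre-fix behavior the commit describes), `out` would still read `[0, 0, 0]` after `base.add_(1)`, silently diverging from eager-mode aliasing semantics.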