f5ee46f1 - Remove custom function in no_grad block error message (#33896)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/33896

Fixes #32625. Previously, we'd get an error if a custom function returned a view of an input inside a no_grad block:

```python
import torch
from torch.autograd import Function

class Alias(Function):
    @staticmethod
    def forward(ctx, x):
        return x[:]

    @staticmethod
    def backward(ctx, gx):
        return gx

inp = torch.rand(2, requires_grad=True)
with torch.no_grad():
    # Used to error out
    output = Alias.apply(inp)
```

After this change, the error no longer happens. The behavior becomes consistent with what we would get if we had implemented an operator that does the same thing as the custom function:
- the output requires_grad
- we are able to detect (and error out) if the user tries to modify the output in-place outside of the no_grad block.

Test Plan:
- new test

Differential Revision: D20345601

Pulled By: zou3519

fbshipit-source-id: 7f95b4254f52ddbf989d26f449660403bcde1c78
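The new behavior can be sketched as follows, assuming a PyTorch build that includes this fix. The `Alias` function mirrors the snippet in the summary; the checks at the end are illustrative, not part of the commit's test plan, and simply confirm that the output is a genuine view of the input:

```python
# Sketch of the post-fix behavior (assumption: running a PyTorch
# version that contains this change).
import torch
from torch.autograd import Function

class Alias(Function):
    @staticmethod
    def forward(ctx, x):
        # Return a view of the input, not a copy.
        return x[:]

    @staticmethod
    def backward(ctx, gx):
        return gx

inp = torch.rand(2, requires_grad=True)
with torch.no_grad():
    # No longer raises after this change.
    out = Alias.apply(inp)

# The output is a view: it shares storage with the input
# and has the same shape and contents.
assert out.data_ptr() == inp.data_ptr()
assert out.shape == inp.shape
```

Because `out` is a view, the autograd engine can track it like any built-in view op, which is what allows it to detect illegal in-place modification outside the no_grad block.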