f3aca45a - [BE][autograd Function] Raise an error if input is returned as-is and saved for forward or backward in setup_context (#97212)

Fixes https://github.com/pytorch/pytorch/issues/96887

We now error out in BOTH cases: when the graph is created and when it is not. This is still BC-breaking, but less severe because it is limited to the case where someone uses setup_context.

This makes the setup_context and non-setup_context versions diverge in their behavior:

- With the non-setup_context version, saved variables are assumed to have the grad_fn of the inputs.
- With the setup_context version, this case now produces an error.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/97212
Approved by: https://github.com/zou3519