d4bbebbb - Back out "Back out "[const_fold] Set requires_grad based on the folded tensor; add device_for_folding option"" (#79696)

Back out "Back out "[const_fold] Set requires_grad based on the folded tensor; add device_for_folding option"" (#79696) Summary: This is an un-backout but with a small change to set the default device `device_for_folded_attrs="cuda"` instead of `"cpu"`, which should avoid BC issues for TRT lowering. Original commit changeset: 4ae1863e28ff Original Phabricator Diff: D37192230 (https://github.com/pytorch/pytorch/commit/24c2aff1b23729b5ded089ffbc81ac3239fc47bd) Test Plan: Added unit test Differential Revision: D37205432 Pull Request resolved: https://github.com/pytorch/pytorch/pull/79696 Approved by: https://github.com/dborkovic