98ab11a2 - separate out dynamo .requires_grad and .is_grad_enabled guards (#100570)

Fixes https://github.com/pytorch/pytorch/issues/100977

This will hopefully also fix the error from [this issue](https://github.com/pytorch/pytorch/issues/99616).

This PR fixes an internal model: we were running an inductor inference graph while `torch.is_grad_enabled()` was True, causing us to error inside the inference graph when we encountered an out= operator. I haven't been able to create a smaller repro; before landing this, I want to build one to convince myself of why we need to separate out these guards.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/100570
Approved by: https://github.com/ezyang
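For context, here is a minimal sketch of the failure mode this guard separation addresses (a hypothetical repro using standard `torch.compile`, not the internal model from the PR): a graph traced under `torch.no_grad()` is only valid while grad is globally disabled, and that condition is independent of whether any input tensor has `.requires_grad` set, so the two need separate guards.

```python
import torch

def f(x):
    # stand-in computation; the real failure involved an out= operator
    # that is only valid in a graph compiled for inference
    return x * 2

x = torch.ones(3)          # x.requires_grad is False
compiled = torch.compile(f)

with torch.no_grad():
    compiled(x)            # traced with grad disabled: inference graph

# The input's .requires_grad is unchanged (still False), but the global
# grad mode has flipped back to enabled. A guard only on the inputs'
# .requires_grad would wrongly reuse the inference graph here; a
# separate guard on torch.is_grad_enabled() triggers a recompile instead.
assert torch.is_grad_enabled()
compiled(x)
```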