[dynamo] fix disable_saved_tensors_hooks - graph break (#106875)
```python
import torch

def wrapper_fn(x):
    with torch.autograd.graph.disable_saved_tensors_hooks("ERROR"):
        y = x + 1
        print("HI")  # induces a graph break inside the context manager
        return y + 2

x = torch.randn(())
a = wrapper_fn(x)
opt = torch.compile(wrapper_fn, backend='eager', fullgraph=False)
e = opt(x)
```
Without the fix, this fails with:
```
Traceback (most recent call last):
File "/home/kshiteej/Pytorch/pytorch_functorch/test/test_trace_grad.py", line 182, in <module>
e = opt(x)
File "/home/kshiteej/Pytorch/pytorch_functorch/torch/_dynamo/eval_frame.py", line 333, in _fn
return fn(*args, **kwargs)
File "/home/kshiteej/Pytorch/pytorch_functorch/test/test_trace_grad.py", line 165, in wrapper_fn
def wrapper_fn(x):
AttributeError: module 'torch.autograd.graph' has no attribute 'disable_saved_tensors_hook'
```
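With the fix, the compiled call completes despite the graph break triggered by the `print`, and the result matches eager execution. A minimal sanity check (a sketch, assuming `x`, `a`, and `opt` from the repro above are in scope):

```python
# Sketch: confirm the compiled function no longer errors and agrees with eager.
e = opt(x)
torch.testing.assert_close(a, e)
print("compiled output matches eager")
```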
Pull Request resolved: https://github.com/pytorch/pytorch/pull/106875
Approved by: https://github.com/zou3519