Makes a streaming backward test try gradient stealing more directly (#60065)
Summary:
Closes https://github.com/pytorch/pytorch/issues/59846.
https://github.com/pytorch/pytorch/issues/59846 is likely paranoia, and some of the test_streaming_backward_* tests in test_cuda.py already exercise gradient stealing (i.e., they start with `.grad`s as `None` before the backward pass). Regardless, this PR augments one of those tests to stress gradient stealing a bit more directly.
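For context, "gradient stealing" refers to autograd taking ownership of the incoming gradient buffer when a leaf's `.grad` is `None`, rather than accumulating into an existing `.grad` tensor. A minimal sketch of the setup these tests exercise (CPU-only illustration; the actual tests run with CUDA streams, and the tensor shapes/values here are illustrative, not from the PR):

```python
import torch

# Leaf tensor whose .grad starts as None, so the first backward can
# "steal" the incoming gradient buffer into .grad instead of accumulating.
a = torch.ones(4, requires_grad=True)
assert a.grad is None

(a * 2).sum().backward()   # first backward: .grad populated (d/da of 2*a is 2)
assert torch.equal(a.grad, torch.full((4,), 2.0))

# A second backward accumulates into the now-existing .grad instead.
(a * 2).sum().backward()
assert torch.equal(a.grad, torch.full((4,), 4.0))
```

The streaming tests in test_cuda.py cover the analogous stealing path when the backward pass runs on non-default CUDA streams.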
Pull Request resolved: https://github.com/pytorch/pytorch/pull/60065
Reviewed By: mrshenli
Differential Revision: D29779518
Pulled By: ngimel
fbshipit-source-id: ccbf278543c3adebe5f4ba0365b1dace9a14da9b