Fix non-deterministic RNG behavior in dist_optimizer tests (#35425)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/35425
Prior to this commit, dist_optimizer_test.py used torch.manual_seed(0)
to set the RNG state. However, multiple RPC threads in the same
process share the same default RNG instance. Therefore, even though
the RNG state was reset before every torch.rand call, a background
RPC thread could still perturb the draw order in the RNG, leading to
non-deterministic behavior. This commit addresses the problem by
avoiding the default RNG.
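The underlying issue can be illustrated with Python's stdlib random
module (an analogy, not the actual test code): seeding the shared
module-level RNG does not guarantee a deterministic draw sequence when
another thread also draws from it, whereas a private RNG instance is
unaffected by concurrent use of the global one. The function names
below are illustrative only.

```python
import random
import threading

def background_draws(n=5):
    # Simulates a background RPC thread drawing from the shared
    # module-level RNG, which perturbs the global draw order.
    for _ in range(n):
        random.random()

def draws_with_global_rng():
    # Resetting the shared RNG does not help: the background thread
    # may interleave its draws with ours, so the result is racy.
    random.seed(0)
    t = threading.Thread(target=background_draws)
    t.start()
    vals = [random.random() for _ in range(3)]
    t.join()
    return vals  # may vary from run to run

def draws_with_private_rng():
    # A private random.Random instance is untouched by threads that
    # use the module-level functions, so the sequence is stable.
    rng = random.Random(0)
    t = threading.Thread(target=background_draws)
    t.start()
    vals = [rng.random() for _ in range(3)]
    t.join()
    return vals  # always the first three draws for seed 0

print(draws_with_private_rng())
```

The analogous move in torch would be drawing from a dedicated
torch.Generator (or precomputed tensors) rather than the process-wide
default generator seeded via torch.manual_seed.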
Test Plan: Imported from OSS
Differential Revision: D20657589
Pulled By: mrshenli
fbshipit-source-id: 0f45b11a902317f15f3ee8448bc240f5723075a5