[Dynamic RPC] Allow newly joined ranks to communicate with existing ranks (#73373)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/73373
This PR allows newly joined ranks in Dynamic RPC to communicate with ranks that have already joined the group. That is, rank N can issue RPCs against any rank <= N.
Previously:
Process 1 (init):
```python
import torch.distributed.rpc as rpc

# Dynamic membership: no world_size is passed, ranks may join over time
rpc.init_rpc("worker0", rank=0)
```
Process 2 (an RPC against a rank that had already joined would fail):
```python
import torch
import torch.distributed.rpc as rpc

rpc.init_rpc("worker1", rank=1)
# Send an RPC to worker0, which joined the group earlier
rpc.rpc_sync("worker0", torch.add, args=(torch.tensor(1), torch.tensor(1)))
```
Now:
The above scenario succeeds.
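The invariant this PR establishes (a newly joined rank N can reach every rank <= N, but not ranks that have yet to join) can be sketched with a small in-process model. This is a hypothetical illustration of the membership rule only, not the TensorPipe agent or the real `init_rpc` bookkeeping:

```python
class DynamicGroupModel:
    """Toy model of dynamic RPC membership (illustrative, not PyTorch internals).

    Ranks join one at a time; an RPC is allowed only between ranks that
    have both already joined the group.
    """

    def __init__(self):
        self.joined = set()

    def init_rpc(self, rank):
        # Corresponds to a worker calling init_rpc without a world_size
        self.joined.add(rank)

    def can_rpc(self, src, dst):
        # src can target any rank that has already joined
        return src in self.joined and dst in self.joined


group = DynamicGroupModel()
group.init_rpc(0)                # worker0 joins first
group.init_rpc(1)                # worker1 joins later
assert group.can_rpc(1, 0)       # worker1 -> worker0 is now allowed
assert not group.can_rpc(1, 2)   # rank 2 has not joined yet
```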
Test:
`pytest test/distributed/rpc/test_tensorpipe_agent.py -vsk test_init_rpc_without_world_size`
Test Plan: Imported from OSS
Reviewed By: mrshenli
Differential Revision: D35052544
Pulled By: H-Huang
fbshipit-source-id: dba48b216731c27730e7d46aefd9e7191c792170
(cherry picked from commit f3c42d8482c933fd746d4da8e64fa40cdf92a221)