[TensorPipe] Do not require user to provide worker name-to-rank map (#38052)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/38052
The initial version of the TensorPipe agent required the user to provide, on each worker, the full map from worker names to their ids. However, it is enough for each worker to specify only its own name and id, as these pairs can then be exchanged through the store.
Addresses #37784, although I think we can go further and use the store to also automatically assign ranks to workers, so that the user only needs to specify a name.
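The exchange-through-the-store idea can be sketched as follows. This is an illustrative simulation, not the agent's actual implementation: the `FileStore`, the `worker_name_{rank}` key scheme, and running both "workers" in one process are all assumptions made for the example; the real agent uses whatever store `init_rpc` was given.

```python
# Hedged sketch: each worker publishes its own (name, rank) pair to the
# shared store, then reads everyone else's, so no worker needs the full
# name-to-rank map up front. Key names here are illustrative only.
import os
import tempfile

from torch.distributed import FileStore

world_size = 2
path = os.path.join(tempfile.mkdtemp(), "rendezvous")
store = FileStore(path, world_size)

# In the real setup each worker runs store.set() once, with its own name
# and rank; here both workers are simulated in a single process.
for rank, name in [(0, "foo"), (1, "bar")]:
    store.set(f"worker_name_{rank}", name)

# Every worker can now reconstruct the full map on its own.
# FileStore.get() returns bytes, hence the decode().
name_to_rank = {
    store.get(f"worker_name_{r}").decode(): r for r in range(world_size)
}
print(name_to_rank)  # {'foo': 0, 'bar': 1}
```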
ghstack-source-id: 103741595
(Note: this ignores all push blocking failures!)
Test Plan:
On worker 0:
```
In [1]: import os
...: import torch
...: import torch.distributed.rpc as rpc
...: os.environ["MASTER_ADDR"] = "127.0.0.1"
...: os.environ["MASTER_PORT"] = "8765"
In [2]: rpc.init_rpc(name="foo", rank=0, backend=rpc.backend_registry.BackendType.TENSORPIPE, world_size=2)
In [3]: rpc.rpc_sync("bar", torch.add, args=(torch.full((2,2), 1), torch.full((2,2), 2)))
Out[3]:
tensor([[3., 3.],
[3., 3.]])
In [4]: rpc.rpc_sync("bar", torch.add, args=(1, 2))
Out[4]: 3
```
On worker 1:
```
In [1]: import os
...: import torch
...: import torch.distributed.rpc as rpc
...: os.environ["MASTER_ADDR"] = "127.0.0.1"
...: os.environ["MASTER_PORT"] = "8765"
In [2]: rpc.init_rpc(name="bar", rank=1, backend=rpc.backend_registry.BackendType.TENSORPIPE, world_size=2)
```
Then also tested by passing `rpc_backend_options=rpc.TensorPipeRpcBackendOptions(init_method="file:///tmp/init/foo")` to `init_rpc`.
Differential Revision: D21463833
fbshipit-source-id: b53d7af6fc060789358ac845aa1898ddea6e8f31