dedupe test skipping in common_distributed and test_distributed (#38078)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/38078
`common_distributed` and `test_distributed` use overlapping exit codes with different meanings. For example, exit code 75 means "no CUDA available" in `test_distributed`, but "need at least 2 CUDA devices" in `common_distributed`.
This is an issue because the tests in `test_distributed` now use the utils in `common_distributed`, so a skipped test could report the wrong skip reason.
It is also the source of test failures in https://github.com/pytorch/pytorch/pull/37990.
This diff dedupes the test skipping logic into `common_distributed.py`, where it can be reused and imported into `test_distributed`.
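The deduped pattern can be sketched as follows: a single table of skip exit codes lives in one module, and decorators exit with the appropriate code when a test's requirements are not met. This is a minimal illustration, not the actual PyTorch implementation; the code values, the `TEST_SKIPS` name, the `skip_if_lt_x_gpu` decorator, and the `_available_gpus` stub (standing in for `torch.cuda.device_count()`) are all assumptions for the example.

```python
import sys
from functools import wraps

# Hypothetical centralized skip codes (analogous to keeping one table in
# common_distributed.py so both test files agree on what each code means).
# The real values in PyTorch may differ.
TEST_SKIPS = {
    "no_cuda": 74,    # assumed: CUDA not available at all
    "multi_gpu": 75,  # assumed: need at least 2 CUDA devices
}

def _available_gpus():
    # Stand-in for torch.cuda.device_count(); hard-coded here so the
    # sketch runs without torch installed.
    return 0

def skip_if_lt_x_gpu(x):
    """Exit with a well-known skip code unless at least x GPUs exist.

    Distributed tests run in subprocesses, so "skipping" is signaled by
    the process exit code, which the parent maps back to a reason.
    """
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if _available_gpus() >= x:
                return fn(*args, **kwargs)
            sys.exit(TEST_SKIPS["multi_gpu"])
        return wrapper
    return decorator
```

Because both test files import the same table, an exit code can only ever mean one thing, which removes the ambiguity described above.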
ghstack-source-id: 103782583
Test Plan: CI
Differential Revision: D21466768
fbshipit-source-id: 53b5af36672ebd8b51ba8b42709d87e96cadef20