Avoid configuring ROCm if USE_CUDA is on. (#26910)
Summary:
Move the resolution of the conflict between `USE_CUDA` and `USE_ROCM` into CMake so that:
- `USE_CUDA=ON` and CUDA is found, and `USE_ROCM=ON` and ROCm is found --> fatal error
- Either `USE_CUDA=ON` and CUDA is found, or `USE_ROCM=ON` and ROCm is found --> the respective GPU feature is enabled
- Otherwise --> no GPU support
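The three cases above could be expressed in CMake roughly as follows. This is a hedged sketch, not the PR's actual code: the variable names `CUDA_FOUND` and `ROCM_FOUND` and the message text are illustrative assumptions.

```cmake
# Sketch of the conflict resolution described above (illustrative only).
# CUDA_FOUND / ROCM_FOUND stand in for whatever detection results the
# build actually uses.
if(USE_CUDA AND CUDA_FOUND AND USE_ROCM AND ROCM_FOUND)
  # Both backends requested and both toolchains present: refuse to build.
  message(FATAL_ERROR
    "Both CUDA and ROCm were found. Enable only one of USE_CUDA / USE_ROCM.")
elseif(USE_CUDA AND CUDA_FOUND)
  # CUDA path: keep CUDA on, make sure ROCm is off.
  set(USE_ROCM OFF)
elseif(USE_ROCM AND ROCM_FOUND)
  # ROCm path: keep ROCm on, make sure CUDA is off.
  set(USE_CUDA OFF)
else()
  # Neither backend usable: build without GPU support.
  set(USE_CUDA OFF)
  set(USE_ROCM OFF)
endif()
```

The key point of moving this into CMake is that the decision happens where the toolchains are actually detected, rather than in the Python build scripts.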
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26910
Differential Revision: D17738652
Pulled By: ezyang
fbshipit-source-id: 8e07cc7e922e0abda24a6518119c28952276064e