pytorch
61aec161 - [ROCm][GHA] split 4 GPU hosts across two runners

[ROCm][GHA] split 4 GPU hosts across two runners

Examine the runner name. If it ends with "-2", indicating the second runner on the host, the docker run arguments select the last two GPUs; otherwise, they select the first two GPUs on the host.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/76849
Approved by: https://github.com/janeyx99
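A rough sketch of that check is below. This is not the actual workflow change: RUNNER_NAME is the standard GitHub Actions variable, but DOCKER_IMAGE and the use of HIP_VISIBLE_DEVICES to narrow GPU visibility are assumptions for illustration; the real docker run arguments may differ.

```bash
# Hypothetical sketch: pick two of the host's four GPUs based on the
# runner name suffix. HIP_VISIBLE_DEVICES usage and DOCKER_IMAGE are
# illustrative assumptions, not the actual workflow code.
if [[ "${RUNNER_NAME}" == *-2 ]]; then
  # Second runner on this host: expose the last two GPUs.
  GPU_ARGS=(-e HIP_VISIBLE_DEVICES=2,3)
else
  # First runner on this host: expose the first two GPUs.
  GPU_ARGS=(-e HIP_VISIBLE_DEVICES=0,1)
fi

docker run "${GPU_ARGS[@]}" \
  --device=/dev/kfd --device=/dev/dri \
  "${DOCKER_IMAGE}"
```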