pytorch
1c0a01e7 - [Distributed] Add a guard for non CPU/CUDA devices

[Distributed] Add a guard for non CPU/CUDA devices

Summary: Adds a guard in Logger so that we do not hit the reducer_->timer_ assertion for non-CPU/CUDA devices (such as lazy) when they integrate with DDP or other distributed APIs.

Test Plan: WIP.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/75247
Approved by: https://github.com/wanchaol
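The sketch below illustrates the kind of guard described: when a backend does not register a device-specific timer, the logger skips its timing stats instead of asserting. This is a minimal standalone C++ sketch with simplified stand-in types (Timer, Reducer, Logger, set_runtime_stats), not the actual c10d classes or the exact change in the PR.

```cpp
#include <iostream>
#include <memory>

// Simplified stand-ins for the c10d Timer/Reducer/Logger relationship.
// On CPU/CUDA a device-specific timer is registered; on other backends
// (e.g. lazy) no timer exists, so the logger must not assert on it.
struct Timer {
  long long avg_forward_time_ns = 0;
};

struct Reducer {
  // Only populated for device types that register a timer (CPU/CUDA here).
  std::unique_ptr<Timer> timer_;
};

struct Logger {
  explicit Logger(Reducer* reducer) : reducer_(reducer) {}

  void set_runtime_stats() {
    // Guard: skip runtime stats instead of hitting an assertion when no
    // timer was created for this device type.
    if (reducer_->timer_ == nullptr) {
      std::cout << "no timer for this device type; skipping runtime stats\n";
      return;
    }
    std::cout << "avg forward time: "
              << reducer_->timer_->avg_forward_time_ns << " ns\n";
  }

  Reducer* reducer_;
};

int main() {
  Reducer cuda_like;
  cuda_like.timer_ = std::make_unique<Timer>();
  Logger(&cuda_like).set_runtime_stats();  // prints timing stats

  Reducer lazy_like;                       // no timer registered
  Logger(&lazy_like).set_runtime_stats();  // guarded: no assertion hit
  return 0;
}
```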
Author: Jiewen Tan