d0cb26ba - [DDP] Fix logging iterations (#64411)

[DDP] Fix logging iterations (#64411)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/64411

The logged iteration numbers are not actually the training iterations; they are offset by how frequently DDP stats collection actually runs (default kDDPRuntimeLoggingSampleRate = 100). With this change, stats are actually logged to scuba at iterations 10, 10 * 100, 40 * 100, etc.

Test Plan: CI

Reviewed By: zhaojuanmao

Differential Revision: D30718274

fbshipit-source-id: 146bd2428753c93363bee37e487f40104fce3c18
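The offset described above can be sketched as follows. This is an illustrative sketch, not PyTorch's actual implementation: only the constant name kDDPRuntimeLoggingSampleRate and its default of 100 come from the commit message; the helper function and milestone values are hypothetical.

```python
# Sketch of how a sampled stats-collection schedule offsets "logging
# iterations" from real training iterations. kDDPRuntimeLoggingSampleRate
# is the DDP sampling-rate constant named in the commit (default 100);
# logged_training_iterations is a hypothetical helper for illustration.

kDDPRuntimeLoggingSampleRate = 100

def logged_training_iterations(milestones, sample_rate=kDDPRuntimeLoggingSampleRate):
    """Map counts of stats-collection runs to the actual training
    iterations at which stats would be logged: each milestone m
    corresponds to training iteration m * sample_rate."""
    return [m * sample_rate for m in milestones]

# Milestones of 10 and 40 stats-collection runs land at training
# iterations 10 * 100 and 40 * 100 respectively.
print(logged_training_iterations([10, 40]))  # → [1000, 4000]
```

The fix amounts to reporting the right-hand side of this mapping (actual training iterations) instead of the raw milestone counts.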