pytorch
512a3a48 - sync AveragedModel buffers when use_buffers=False (#84054)

Fixes #84053. As described in the issue, `AveragedModel` deep-copies the model during initialization, which means the buffers in the averaged model are not updated together with the source model. One solution is to copy the buffers from the source model every time `update_parameters` is called.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/84054
Approved by: https://github.com/samdow
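The behavior described above can be illustrated with a simplified, dependency-free sketch. `ToyModel` and `ToyAveragedModel` below are hypothetical stand-ins (not the actual `torch.optim.swa_utils.AveragedModel` implementation): the wrapper deep-copies the source model, so its buffers would stay frozen at their initial values unless `update_parameters` re-syncs them, which is the essence of the fix.

```python
import copy

class ToyModel:
    """Stand-in for an nn.Module: one parameter and one buffer
    (e.g. a BatchNorm running statistic)."""
    def __init__(self):
        self.weight = 1.0        # learnable parameter
        self.running_mean = 0.0  # buffer, updated during training

class ToyAveragedModel:
    """Simplified sketch of the AveragedModel idea (hypothetical code,
    not PyTorch's implementation)."""
    def __init__(self, model, use_buffers=False):
        # Deep copy: from here on, buffers are detached from the source model.
        self.module = copy.deepcopy(model)
        self.use_buffers = use_buffers
        self.n_averaged = 0

    def update_parameters(self, model):
        # Running average of parameters, as in SWA/EMA-style averaging.
        if self.n_averaged == 0:
            self.module.weight = model.weight
        else:
            self.module.weight += (
                (model.weight - self.module.weight) / (self.n_averaged + 1)
            )
        self.n_averaged += 1
        if not self.use_buffers:
            # The fix: copy buffers from the source model so the averaged
            # model sees up-to-date running statistics.
            self.module.running_mean = model.running_mean

model = ToyModel()
swa = ToyAveragedModel(model, use_buffers=False)
model.weight, model.running_mean = 3.0, 0.5  # simulate a training step
swa.update_parameters(model)
model.weight, model.running_mean = 5.0, 0.7  # another training step
swa.update_parameters(model)
print(swa.module.weight)        # 4.0 (average of 3.0 and 5.0)
print(swa.module.running_mean)  # 0.7 (synced from the source model)
```

Without the buffer copy, `swa.module.running_mean` would remain at its deep-copied initial value of `0.0`, which is the bug reported in #84053.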