pytorch
9c6979a2 - [Gradient Compression] Error feedback for PowerSGD (still need to fix the key in error_dict) (#48670)

[Gradient Compression] Error feedback for PowerSGD (still need to fix the key in error_dict) (#48670)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/48670

Support an optional error feedback for PowerSGD: store the difference (i.e., the local error caused by compression) between the input gradient (adjusted by the existing error) and the gradient after decompression, and reinsert it at the next iteration.

We still need to add an index field to GradBucket as the key of error_dict, because the current key, the input tensor of the bucket, can change across steps: the buckets may be rebuilt during the forward pass to save peak memory usage. This PR is the first half of error feedback; the new index field is planned for a separate PR.

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202

ghstack-source-id: 117636492

Test Plan:
buck test mode/dev-nosan caffe2/test/distributed:c10d -- test_powerSGD_ddp_comm_hook_nccl

Reviewed By: rohan-varma

Differential Revision: D25240290

fbshipit-source-id: 5b6e11e711caccfb8984ac2767dd107dbf4c9b3b
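The error-feedback scheme described in the summary can be illustrated outside of DDP. The following is a minimal NumPy sketch, not the actual PyTorch communication-hook implementation: the gradient is compressed to a rank-1 approximation with a single power iteration (the core idea of PowerSGD), and the compression residual is stored so it can be added back to the next step's gradient. All names here (`powersgd_step`, `error`, `q`) are illustrative, and the real hook keys its error buffers per bucket in `error_dict`, which is the part this commit still leaves to fix.

```python
import numpy as np

def powersgd_step(grad, error, q, n_iter=1):
    # Error feedback: reinsert the residual left over from the previous step
    # before compressing, so no gradient information is permanently lost.
    m = grad + error
    # One (or more) power iterations yield a rank-1 approximation m ~ p q^T.
    for _ in range(n_iter):
        p = m @ q
        p = p / (np.linalg.norm(p) + 1e-12)
        q = m.T @ p
    approx = np.outer(p, q)
    # The local compression error is stored for reinsertion next iteration.
    new_error = m - approx
    return approx, new_error, q
```

For an exactly rank-1 gradient the approximation is lossless and the stored error is (numerically) zero; for higher-rank gradients the residual is nonzero and gets carried into the next step instead of being discarded.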
Author
Yi Wang