4b00bce1 - [Gradient Compression] Introduce fp16_compress_wrapper in ddp_comm_hooks.rst (#54052)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/54052

Introduce `fp16_compress_wrapper`, which can provide additional speedup on top of gradient compression algorithms such as PowerSGD.

ghstack-source-id: 124001805

Test Plan: {F509205173}

Reviewed By: iseessel

Differential Revision: D27076064

fbshipit-source-id: 4845a14854cafe2112c0caefc1e2532efe9d3ed8
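For context, a minimal sketch of how this wrapper is typically registered on a DDP model via the public `torch.distributed.algorithms.ddp_comm_hooks` API; the model, device, and process-group setup below are illustrative assumptions, not part of this commit:

```python
import torch
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed.algorithms.ddp_comm_hooks import powerSGD_hook as powerSGD
from torch.distributed.algorithms.ddp_comm_hooks.default_hooks import fp16_compress_wrapper

# Assumes the process group has already been initialized, e.g. via
# dist.init_process_group("nccl", ...); `model` and `rank` are placeholders.
model = nn.Linear(1024, 1024).cuda()
rank = 0
ddp_model = DDP(model, device_ids=[rank])

# PowerSGD state: rank-1 low-rank approximation, with vanilla allreduce
# used for the first 10 iterations before compression kicks in.
state = powerSGD.PowerSGDState(
    process_group=None,  # None selects the default process group
    matrix_approximation_rank=1,
    start_powerSGD_iter=10,
)

# fp16_compress_wrapper casts gradients to torch.float16 before the wrapped
# hook runs and casts the result back afterwards, stacking FP16 compression
# on top of PowerSGD compression.
ddp_model.register_comm_hook(state, fp16_compress_wrapper(powerSGD.powerSGD_hook))
```

The wrapper composes with any communication hook of the same signature, so the same pattern applies to other hooks besides `powerSGD_hook`.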
Author: Yi Wang