pytorch
8016d28c - [Gradient Compression] Update the comment on fp16_compress_hook (#53780)

[Gradient Compression] Update the comment on fp16_compress_hook (#53780)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/53780

Update the comment, because the input data type of `fp16_compress_hook` does not have to be FP32. For example, the input dtype can also be FP64, as long as it can be cast to FP16.

ghstack-source-id: 123680621
Test Plan: N/A
Reviewed By: iseessel
Differential Revision: D26967224
fbshipit-source-id: 26d79a3629a597e6335b6f59c97d25a764a8ed80
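The point of the comment change can be illustrated with a minimal sketch of the fp16 compression idea: the hook downcasts the gradient tensor to FP16 before communication and restores the original dtype afterwards, so any input dtype that can be cast to FP16 (FP32, FP64, ...) works. This is a hypothetical NumPy stand-in, not the actual `fp16_compress_hook` implementation; the function name and the omitted allreduce step are illustrative assumptions.

```python
import numpy as np

def fp16_compress_sketch(grad):
    # Remember the caller's dtype -- may be float32 OR float64,
    # which is exactly what the updated comment clarifies.
    original_dtype = grad.dtype
    # Downcast to half precision to shrink the communication payload.
    compressed = grad.astype(np.float16)
    # ... in the real hook, an allreduce on `compressed` happens here ...
    # Decompress: restore the tensor to its original dtype.
    return compressed.astype(original_dtype)

# An FP64 gradient is accepted just as well as an FP32 one.
g64 = np.array([0.5, -1.25, 3.0], dtype=np.float64)
out = fp16_compress_sketch(g64)
print(out.dtype)
```

Note that values outside the FP16 range would overflow during the downcast, which is the usual caveat of this compression scheme.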
Author: Yi Wang