pytorch
e57a1197 - Remove autograd copy_ specific isFloatingPoint (#28279)

Summary: Remove the autograd copy_-specific isFloatingPoint and use c10's isFloatingType (and isComplexType) instead. Before this, .to or .copy_ would drop requires_grad for bfloat16, as the floating types were only considered to be double, float, and half.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/28279
Differential Revision: D18176084
Pulled By: izdeby
fbshipit-source-id: 8a005a6105e4a827be5c8163135e693a7daae4f4
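A minimal Python sketch of the user-visible behavior described above (the commit itself changes the C++ autograd type check, not Python code); the "previously False" outcome is inferred from the summary, not reproduced from a pre-fix build.

```python
import torch

# Before this change, autograd's own floating-point check only recognized
# double, float, and half, so casting or copying to bfloat16 dropped requires_grad.
x = torch.randn(3, requires_grad=True)

y = x.to(torch.bfloat16)   # copy_ into a bfloat16 tensor behaves the same way
print(y.requires_grad)     # True with c10's isFloatingType; previously False

# With requires_grad preserved, gradients flow back through the cast.
y.sum().backward()
print(x.grad)              # ones, in x's original dtype
```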