Update cuda amp to also check xla device (#63413)
Summary:
Fixes https://github.com/pytorch/xla/issues/3086. PyTorch/XLA:GPU also uses CUDA AMP, so the availability check should not bail out just because no CUDA device is visible. I verified the pt/xla `test_autocast` suite with this fix and all tests passed.
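
A minimal sketch of the kind of availability check this change targets, assuming a helper along the lines of `amp_definitely_not_available` in the cuda amp utilities (the helper name and its exact location are assumptions for illustration, not the literal diff):

```python
# Sketch only: treat AMP as available when either a CUDA device is present
# or torch_xla is installed, so PyTorch/XLA:GPU can route through cuda amp.
from importlib.util import find_spec

import torch


def amp_definitely_not_available():
    # find_spec returns None when torch_xla is not installed; a present
    # ModuleSpec counts the XLA backend as an eligible device.
    return not (torch.cuda.is_available() or find_spec("torch_xla") is not None)
```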
Pull Request resolved: https://github.com/pytorch/pytorch/pull/63413
Reviewed By: ngimel
Differential Revision: D30380785
Pulled By: bdhirsh
fbshipit-source-id: fd1a1de7d224c616fc3fa90b80a688a21f6b1ecc