AMP for TPUs v2 (#5148)
* TPU AMP
* Add torch pin
* updates
* Clean up
* Delete test_fsdp_auto_wrap_amp.py
* updates
* Update autocast key to pick between CUDA and XLA; add unit tests
* lint
* Move code from pt to ptxla
* fixes
* lint
* Move autocast test from common.sh to run_tests.sh
* updates
* Updates with PyTorch
* lint
* Build autocast_mode
* Disable bazel remote cache
* Experiment with no new files
* Revert back
* lint
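As a hedged sketch (not part of this PR's diff): the autocast-key change above suggests `torch.autocast` dispatches on device type, so on TPU one would presumably use `device_type="xla"`. The CPU example below demonstrates the same autocast mechanics with bfloat16, since a TPU runtime is not assumed here.

```python
import torch

# Assumption: after this PR, PyTorch/XLA hooks into the standard
# torch.autocast API keyed on device type ("xla" on a TPU device).
# This example uses device_type="cpu" purely to show the mechanics.
a = torch.randn(4, 4)
b = torch.randn(4, 4)

with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    # Matmul is on the autocast lower-precision op list, so the
    # result is computed and returned in bfloat16.
    c = a @ b

print(c.dtype)  # torch.bfloat16
```

On an actual TPU the context would be `torch.autocast(device_type="xla")` with tensors placed on the XLA device; the surrounding code is otherwise unchanged.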