Fix PyTorch unet train: only use mixed precision in the forward pass. (#873)
Summary:
We should only apply AMP (automatic mixed precision) autocast to the forward pass during training, not the backward pass.
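A minimal sketch of the intended pattern, using a hypothetical model and data (not the actual unet benchmark code): the forward pass and loss computation run inside the autocast context, while `backward()` and the optimizer step run outside it.

```python
import torch

# Hypothetical tiny model and data, for illustration only.
model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(8, 4)
target = torch.randn(8, 2)

# Forward pass under autocast: eligible ops run in reduced precision.
# (device_type="cpu" with bfloat16 is used here so the sketch runs anywhere;
# the benchmark itself would use CUDA.)
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    loss = torch.nn.functional.mse_loss(model(x), target)

# Backward pass and optimizer step happen OUTSIDE the autocast context.
opt.zero_grad()
loss.backward()
opt.step()
```

Keeping `backward()` outside autocast matches PyTorch's recommended AMP recipe: gradients are produced in the same (autocast-chosen) dtypes as the forward ops, and wrapping the backward pass in autocast as well is unnecessary.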
Pull Request resolved: https://github.com/pytorch/benchmark/pull/873
Reviewed By: erichan1
Differential Revision: D35765808
Pulled By: xuzhao9
fbshipit-source-id: 795e308c2ddee391952973ba901b7531f1870541