[BE][FSDP] Remove unneeded `torch.cuda.synchronize()` (#80868)
The explicit `torch.cuda.synchronize()` is unnecessary because `summon_full_params()` already calls it internally:
https://github.com/pytorch/pytorch/blob/f7678055033045688ae5916c8df72f5107d86a4a/torch/distributed/fsdp/fully_sharded_data_parallel.py#L2516
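The redundancy can be sketched with a toy stand-in (not the actual FSDP code): `summon_full_params()` is a context manager that synchronizes on your behalf, so an extra `torch.cuda.synchronize()` at the call site adds nothing. The counter below is purely illustrative, used so the behavior is observable without a GPU.

```python
from contextlib import contextmanager

# Hypothetical stand-in for torch.cuda.synchronize(); counts calls
# so the redundancy is observable without CUDA.
sync_calls = 0

def cuda_synchronize():
    global sync_calls
    sync_calls += 1

@contextmanager
def summon_full_params():
    # Like FSDP's summon_full_params(), this synchronizes internally,
    # so callers need not synchronize again afterward.
    try:
        yield
    finally:
        cuda_synchronize()

with summon_full_params():
    pass  # access full (unsharded) parameters here

# No additional cuda_synchronize() is needed here: the context
# manager already issued one on exit.
print(sync_calls)  # → 1
```

Since the synchronize happens inside the context manager's exit path, the caller observes exactly one synchronization; a second explicit call would only add latency.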
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80868
Approved by: https://github.com/rohan-varma