[FSDP] Fix `clip_grad_norm_()` when rank has no local gradients (#94835)
`functools.reduce()` raises a `TypeError` when called on an empty iterable without an initializer, which happens when a rank holds no local gradients. We need to add a case for `len(grads) == 0`.
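
A minimal sketch of the failure mode and the guard, assuming the local norm is accumulated with `functools.reduce()` over per-gradient norms; the helper name `_local_grad_norm` and its exact reduction are hypothetical, not the actual FSDP implementation:

```python
import functools
import torch

def _local_grad_norm(grads, norm_type: float = 2.0) -> torch.Tensor:
    # Hypothetical guard: a rank with no local gradients contributes a zero
    # norm so the later cross-rank reduction still yields the global norm.
    if len(grads) == 0:
        return torch.tensor(0.0)
    per_grad_norms = [torch.linalg.vector_norm(g, norm_type) for g in grads]
    # Accumulate the per-gradient norms into a single local total norm.
    return functools.reduce(
        lambda a, b: torch.linalg.vector_norm(torch.stack([a, b]), norm_type),
        per_grad_norms,
    )

# Without the len(grads) == 0 guard, functools.reduce() would raise a
# TypeError on the empty list of per-gradient norms.
empty_rank_norm = _local_grad_norm([])
```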
Pull Request resolved: https://github.com/pytorch/pytorch/pull/94835
Approved by: https://github.com/zhaojuanmao