Add TORCH_CHECK for floating point exception in native_group_norm
This fixes #73194, and it should also be sufficient for `torch.group_norm` (#73188), since that path has to call `native_group_norm`. Note that `torch.nn.GroupNorm` instead throws a ZeroDivisionError when `num_groups=0`; should we add a nicer error message there as well?
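For illustration, the input validation added here behaves roughly like the following pure-Python sketch (the function name and messages are hypothetical; the actual check is a `TORCH_CHECK` in the C++ `native_group_norm` kernel). Without such a guard, `num_groups=0` leads to an integer division by zero when computing channels per group, which surfaces as a floating point exception (SIGFPE) rather than a Python error:

```python
def check_group_norm_inputs(num_channels: int, num_groups: int) -> None:
    """Hypothetical sketch of the guard: reject num_groups values that
    would cause C // G (channels per group) to divide by zero or that
    do not evenly partition the channels."""
    if num_groups <= 0:
        raise ValueError(
            f"Expected num_groups to be greater than 0, got {num_groups}"
        )
    if num_channels % num_groups != 0:
        raise ValueError(
            f"Expected number of channels ({num_channels}) to be "
            f"divisible by num_groups ({num_groups})"
        )
```

With this check in place, a call like `native_group_norm` with `num_groups=0` fails with a readable error instead of crashing the process.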
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75270
Approved by: https://github.com/bdhirsh