Make fmod work with zero divisors consistently (#41948)
Summary:
Currently `torch.tensor(1, dtype=torch.int).fmod(0)` crashes with a floating point exception (SIGFPE), since integer modulo by zero is undefined behavior at the hardware level. This PR fixes the crash so that zero divisors are handled consistently across dtypes instead of terminating the process.
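To illustrate the consistency goal, here is a minimal pure-Python sketch (not the PR's actual implementation) of a `safe_fmod` helper: integer inputs get an explicit zero-divisor guard that raises a catchable error instead of triggering a SIGFPE, while floating-point inputs follow C's `fmod` semantics, where a zero divisor yields NaN. The helper name and structure are hypothetical, for exposition only.

```python
import math

def safe_fmod(x, y):
    """Hypothetical sketch of fmod with an explicit zero-divisor guard."""
    if isinstance(x, int) and isinstance(y, int):
        if y == 0:
            # Integer modulo by zero is undefined behavior in C;
            # raise a catchable error instead of crashing the process.
            raise ZeroDivisionError("fmod: integer modulo by zero")
        # C-style fmod truncates toward zero (unlike Python's %, which floors).
        return x - int(x / y) * y
    if y == 0:
        # Floating-point path: C's fmod returns NaN for a zero divisor,
        # so mirror that instead of raising.
        return math.nan
    return math.fmod(x, y)
```

With this guard, the integer case raises `ZeroDivisionError` rather than crashing, and the float case quietly produces NaN, matching the result sign conventions of C's `fmod` (e.g. `safe_fmod(-7, 3)` is `-1`, not `2` as Python's `%` would give).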
Pull Request resolved: https://github.com/pytorch/pytorch/pull/41948
Reviewed By: ngimel
Differential Revision: D22771081
Pulled By: ezyang
fbshipit-source-id: a94dd35d6cd85daa2d51cae8362004e31f97989e