elu, selu, celu fwd-over-rev rules
elu_backward is technically a binary pointwise operation (it takes two inputs and
produces a single output via a binary pointwise computation). So I applied the
"rule" for binary pointwise operations, which is a combination of the backward
formulas with respect to grad_output and self_or_result.
NB: none of these operations support complex numbers, so I didn't keep complex
support in mind while writing this.
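To illustrate the binary pointwise rule, here is a minimal scalar sketch (not the actual PyTorch implementation, which operates on tensors and is written in the derivatives machinery). It treats elu_backward as g(grad_output, x) = grad_output * elu'(x) and forms its forward-mode JVP by combining the partial derivatives with respect to both inputs; the names and the fixed alpha are illustrative assumptions:

```python
import math

ALPHA = 1.0  # assumed ELU alpha for this sketch


def elu(x, alpha=ALPHA):
    # ELU: x for x > 0, alpha * (exp(x) - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)


def elu_backward(grad_output, x, alpha=ALPHA):
    # Reverse-mode rule: grad_input = grad_output * elu'(x)
    d = 1.0 if x > 0 else alpha * math.exp(x)  # elu'(x)
    return grad_output * d


def elu_backward_jvp(grad_output, x, t_grad_output, t_x, alpha=ALPHA):
    # Forward-over-reverse: elu_backward is binary pointwise in
    # (grad_output, x), so its JVP mixes the tangents of both inputs:
    #   dg = t_grad_output * elu'(x) + grad_output * elu''(x) * t_x
    d = 1.0 if x > 0 else alpha * math.exp(x)   # elu'(x)
    dd = 0.0 if x > 0 else alpha * math.exp(x)  # elu''(x)
    return t_grad_output * d + grad_output * dd * t_x
```

A finite-difference check of the JVP, e.g. `(elu_backward(go + eps*t_go, x + eps*t_x) - elu_backward(go, x)) / eps`, should agree with `elu_backward_jvp(go, x, t_go, t_x)` up to O(eps).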
Test Plan:
- wait for tests
Fixes #ISSUE_NUMBER
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75297
Approved by: https://github.com/soulitzer