Split SiLU OpInfo
Motivation
==========
We would like to test autograd support (forward-mode AD and reverse-mode
AD) for SiLU in functorch. Unfortunately, the OpInfo for
nn.functional.silu has supports_autograd and supports_forward_ad set to
False. This is because nn.functional.silu does not support complex
autograd.
Solution
========
This PR splits the OpInfo for nn.functional.silu into two: one OpInfo
tests non-complex dtypes and the other tests complex dtypes, as sketched
below.
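
For illustration, here is a minimal sketch of what the split could look
like as two op_db-style entries. The import paths, dtype sets, and
sample-input helper below are assumptions about PyTorch's internal (and
unstable) OpInfo machinery, not the PR's actual diff:

```python
import torch
from torch.testing._internal.common_dtype import (
    complex_types, floating_types_and,
)
from torch.testing._internal.common_methods_invocations import (
    OpInfo, SampleInput,
)

def sample_inputs_silu(op_info, device, dtype, requires_grad, **kwargs):
    # Hypothetical minimal sample-input generator, for illustration only.
    t = torch.randn((2, 3), device=device, dtype=dtype,
                    requires_grad=requires_grad)
    return [SampleInput(t)]

silu_opinfos = [
    # Non-complex variant: autograd and forward-mode AD can now be tested.
    OpInfo(
        'nn.functional.silu',
        dtypes=floating_types_and(torch.half, torch.bfloat16),
        supports_autograd=True,
        supports_forward_ad=True,
        sample_inputs_func=sample_inputs_silu,
    ),
    # Complex variant: silu lacks complex autograd, so both flags stay False.
    OpInfo(
        'nn.functional.silu',
        variant_test_name='complex',
        dtypes=complex_types(),
        supports_autograd=False,
        supports_forward_ad=False,
        sample_inputs_func=sample_inputs_silu,
    ),
]
```

The variant_test_name field lets both entries share the operator name
while producing distinct test names, so complex coverage is kept without
blocking the autograd tests on the real-dtype entry.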
Alternatives
============
- We could manually add tests in functorch.
- We could add complex autograd support for SiLU (I don't know how to do
  this, but if it is easy I'm happy to try); the derivative sketch after
  this list shows what the math would involve.
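
For context on the second alternative: SiLU is holomorphic away from the
poles of the sigmoid, so its complex derivative follows the same formula
as the real one. A sketch of the derivation (my own working, not from
the PR):

```latex
\operatorname{silu}(z) = z\,\sigma(z), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}
\frac{d}{dz}\operatorname{silu}(z)
  = \sigma(z) + z\,\sigma(z)\bigl(1 - \sigma(z)\bigr)
  = \sigma(z)\bigl(1 + z\,(1 - \sigma(z))\bigr)
```

Registering this formula for complex dtypes (presumably via
tools/autograd/derivatives.yaml) is roughly what "adding complex
autograd support" would amount to.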
Test Plan
=========
Run the existing OpInfo test suite (test/test_ops.py exercises the
entries in op_db).
Fixes #ISSUE_NUMBER
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75205
Approved by: https://github.com/soulitzer