Disable gradient check for linalg.eig (#109165)
Both the eager and compiled versions fail with the following message
when trying to compute the gradient:
    RuntimeError: linalg_eig_backward: The eigenvectors in the complex
    case are specified up to multiplication by e^{i phi}. The specified
    loss function depends on this quantity, so it is ill-defined.
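The ambiguity the error describes can be illustrated outside of autograd entirely (a sketch, not part of the PR): for any eigenpair (lam, v), the re-phased vector e^{i*phi} * v is an equally valid eigenvector, so a loss that depends on the phase of v has no well-defined value or gradient.

```python
import numpy as np

# Rotation generator: real matrix with a genuinely complex eigenpair.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lams, vecs = np.linalg.eig(A)
lam, v = lams[0], vecs[:, 0]

# Multiply the eigenvector by an arbitrary unit-modulus phase e^{i*phi}.
phase = np.exp(1j * 0.7)
w = phase * v

# Both v and the re-phased w satisfy the eigenvalue equation, so the
# decomposition cannot pin down the phase of its eigenvectors.
assert np.allclose(A @ v, lam * v)
assert np.allclose(A @ w, lam * w)
```

Any gradient check whose loss is sensitive to that phase is therefore ill-posed by construction, which is why the backward raises rather than returning an arbitrary answer.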
I'm not sure if there's a way to adapt the OpInfo such that the grad is
computable, but we should at least check that the forward pass is
correct.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/109165
Approved by: https://github.com/eellison