Factor out numerical logic (#54479)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/54479
This change is similar to #54049 in that it helps us factor out some code that can be used in both fast and slow versions of gradcheck.
- `compute_gradient` and `compute_numerical_jacobian_cols` now have fewer responsibilities:
  - `compute_numerical_jacobian_cols` essentially handles only the complexity of complex derivatives
  - `compute_gradient` handles only the finite differencing (it no longer worries about different layouts or indexing into the input tensor)
- we again have two stages: first compute the Jacobian columns separately, then combine them
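To illustrate the two-stage structure described above (not the actual gradcheck code; `numerical_jacobian_cols` and `eps` here are hypothetical names for this sketch), stage one computes each Jacobian column via central finite differences, and stage two combines the columns into the full Jacobian:

```python
import torch

def numerical_jacobian_cols(fn, inp, eps=1e-6):
    # Stage 1: perturb one input element at a time and compute the
    # corresponding Jacobian column with a central difference.
    flat = inp.detach().clone().reshape(-1)
    cols = []
    for i in range(flat.numel()):
        orig = flat[i].item()
        flat[i] = orig + eps
        f_plus = fn(flat.reshape(inp.shape)).reshape(-1)
        flat[i] = orig - eps
        f_minus = fn(flat.reshape(inp.shape)).reshape(-1)
        flat[i] = orig  # restore the input before the next column
        cols.append((f_plus - f_minus) / (2 * eps))
    # Stage 2: combine the separately computed columns.
    return torch.stack(cols, dim=1)

# Usage: the Jacobian of y = x**2 is diag(2x).
x = torch.tensor([1.0, 2.0, 3.0], dtype=torch.float64)
jac = numerical_jacobian_cols(lambda t: t * t, x)
```

Keeping the per-column computation separate from the combining step is what lets the fast and slow gradcheck paths share the column logic while combining the results differently.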
Test Plan: Imported from OSS
Reviewed By: jbschlosser
Differential Revision: D27728727
Pulled By: soulitzer
fbshipit-source-id: fad3d5c1a91882621039beae3d0ecf633c19c28c