Fix Dispatching not considering List[Optional[Tensor]] for dispatch (#60787)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/60787
Fixes #60461.
Previously, when one called `self.index(indices)` with a regular `self`
Tensor and a `BatchedTensor` `indices`, the dispatcher would not dispatch
to the Batched key. This is because the dispatcher did not extract
dispatch keys from `indices`.
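For illustration, here is a minimal libtorch sketch of the call path in
question (not code from this PR; it assumes a libtorch build of roughly this
era, where `aten::index.Tensor` takes its indices as
`c10::List<c10::optional<Tensor>>`):
```cpp
// Minimal sketch of the call path (assumes libtorch; not code from this PR).
// aten::index.Tensor(Tensor self, Tensor?[] indices) receives its indices
// as c10::List<c10::optional<Tensor>> in C++.
#include <torch/torch.h>
#include <iostream>

int main() {
  at::Tensor self = torch::randn({5, 3});

  c10::List<c10::optional<at::Tensor>> indices;
  indices.push_back(c10::optional<at::Tensor>(torch::tensor({0, 2, 4}, torch::kLong)));

  // Dispatches aten::index.Tensor(Tensor self, Tensor?[] indices).
  // Previously only `self` contributed dispatch keys; if an entry of
  // `indices` were a BatchedTensor, the Batched key was missed.
  at::Tensor out = at::index(self, indices);
  std::cout << out.sizes() << std::endl;  // [3, 3]
  return 0;
}
```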
Similar to #58283 and #58296, this PR modifies the dispatcher to extract
dispatch keys from `List[Optional[Tensor]]` arguments. We do this for both
boxed and unboxed kernels.
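Conceptually, the extraction now behaves like the following rough sketch (a
hypothetical helper for illustration, not the actual dispatcher code):
```cpp
// Rough conceptual sketch of the idea (hypothetical helper, not the actual
// key-extraction code in the dispatcher): dispatch keys are gathered from
// every defined Tensor inside a List[Optional[Tensor]] argument, in addition
// to the keys of `self`.
#include <ATen/ATen.h>

c10::DispatchKeySet gather_dispatch_keys(
    const at::Tensor& self,
    const c10::List<c10::optional<at::Tensor>>& indices) {
  c10::DispatchKeySet ks = self.key_set();
  for (c10::optional<at::Tensor> idx : indices) {
    if (idx.has_value() && idx->defined()) {
      // A BatchedTensor entry contributes the Batched key here, so
      // aten::index.Tensor can be routed to the Batched kernel (or its
      // boxed fallback) instead of the plain CPU/CUDA kernel.
      ks = ks | idx->key_set();
    }
  }
  return ks;
}
```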
Test Plan:
- Run the test case in
https://gist.github.com/zou3519/4421df7c5271376a0ef53ca857b18740
(requires functorch). After this PR, it raises `RuntimeError: Batching
rule not implemented for aten::index.Tensor. We could not generate a
fallback.`, which shows that dispatch happened on the Batched key.
- Taking suggestions for how to write a test for this in core
Reviewed By: jbschlosser
Differential Revision: D29438611
Pulled By: zou3519
fbshipit-source-id: 77e182f763e18aa3fa857eebafa8b7f83384db71