9439cb0e - Avoid using einsum for torch.cat DTensor propagation (#100251)

Avoid using einsum for torch.cat DTensor propagation (#100251)

DTensor was reusing `einop_rule` to propagate sharding for torch.cat. However, einsum only supports up to 52 subscripts (i.e., input tensors), and we have encountered use cases where a single cat operator has more than 60 input tensors. Therefore, this commit reimplements the sharding propagation rule for cat without using einsum.

Differential Revision: [D45435232](https://our.internmc.facebook.com/intern/diff/D45435232)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/100251
Approved by: https://github.com/wanchaol
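
For context, a minimal sketch of why the einsum-based path breaks down: einsum equations label dimensions with single ASCII letters, so only 52 distinct subscripts exist, while torch.cat itself accepts arbitrarily many inputs. The `can_propagate_cat_sharding` helper below is purely illustrative, with hypothetical names and a simplified placement model; it is not the actual DTensor rule added by this commit.

```python
import string

import torch

# einsum equations name dimensions with single letters; only the 26 lowercase
# plus 26 uppercase ASCII letters are valid, so at most 52 distinct subscripts
# (and hence at most 52 input tensors) can be expressed.
print(len(string.ascii_lowercase + string.ascii_uppercase))  # 52

# torch.cat has no such cap: concatenating 60 tensors is routine, which is why
# the sharding rule could not keep going through the einsum-based einop_rule.
parts = [torch.ones(2, 3) for _ in range(60)]
out = torch.cat(parts, dim=0)
print(out.shape)  # torch.Size([120, 3])


# Hypothetical, simplified sketch of an einsum-free check in the spirit of the
# new rule: every input must carry the same sharding, and none may be sharded
# along the concatenation dimension (otherwise redistribution would be needed).
def can_propagate_cat_sharding(sharded_dims: list[set[int]], cat_dim: int) -> bool:
    first = sharded_dims[0]
    return all(dims == first for dims in sharded_dims) and cat_dim not in first


print(can_propagate_cat_sharding([{0}] * 60, cat_dim=1))  # True
print(can_propagate_cat_sharding([{1}] * 60, cat_dim=1))  # False
```

The point of the sketch is only that a direct, per-input check scales with the number of cat operands, whereas an einsum equation cannot even be written once the operand count exceeds the 52 available subscript letters.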