Add GenericShardingSpec for generic tensor sharding. (#57409)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/57409
Full design: https://github.com/pytorch/pytorch/issues/55207
In https://github.com/pytorch/pytorch/issues/55207, we proposed
`MeshShardingSpec` as a generic sharding mechanism. However, that proposal
assumes even partitioning and provides no way to specify shards with uneven
sizes, which is a requirement for an internal use case.
Instead, we introduce `GenericShardingSpec`, which allows specifying an
arbitrary partitioning of a multi-dimensional tensor: for each shard, it
records the start offset and the length along every dimension, allowing for
greater flexibility. A sketch of the intended usage follows.
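
As an illustration, here is a minimal sketch of how such a spec might be
constructed for an unevenly partitioned 10 x 4 tensor. The `ShardMetadata`
helper, its parameter names, the import path, and the placement strings are
assumptions for illustration, not necessarily the exact API:

```python
# Hypothetical sketch: enumerate each shard by its start offsets and
# per-dimension lengths, plus where it should be placed.
from torch.distributed._sharding_spec import GenericShardingSpec, ShardMetadata

spec = GenericShardingSpec([
    ShardMetadata(
        shard_offsets=[0, 0],   # start offset in each tensor dimension
        shard_lengths=[3, 4],   # extent along each dimension: 3 rows, 4 cols
        placement="rank:0/cuda:0",
    ),
    ShardMetadata(
        shard_offsets=[3, 0],   # second shard starts at row 3
        shard_lengths=[7, 4],   # uneven partition: 7 rows on this shard
        placement="rank:1/cuda:1",
    ),
])
```

Because each shard carries its own offsets and lengths, the shards need not
be equal in size, which is what the even-partitioning `MeshShardingSpec`
proposal could not express.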
ghstack-source-id: 129604155
Test Plan:
1) unit tests
2) waitforbuildbot
Reviewed By: SciPioneer
Differential Revision: D28137616
fbshipit-source-id: 61255762485fb8fa3ec3a43c27bbb222ca25abff