Remove PlacementSpec from ShardingSpecs. (#59990)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/59990
ShardingSpecs accepted a Device/PlacementSpec and were initially
written this way for flexibility. However, this is slightly confusing
since there is no general use case for it. To keep things simple, I've
changed both specs to accept only devices for now.
We can always extend this to a general PlacementSpec later on.
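As an illustrative sketch (not the actual PyTorch implementation; the class and validation below are simplified stand-ins), the simplification means a sharding spec stores plain device placements such as `"rank:0/cuda:0"` rather than an arbitrary PlacementSpec hierarchy:

```python
# Hypothetical sketch of a device-only sharding spec; names and the
# "rank:<n>/<device>" string format are assumptions for illustration.
from dataclasses import dataclass
from typing import List


@dataclass
class ChunkShardingSpec:
    """Shards a tensor along `dim` across the given device placements."""
    dim: int
    placements: List[str]  # device placements only, no general PlacementSpec

    def __post_init__(self):
        for p in self.placements:
            # Minimal validation: expect "rank:<n>/<device>" strings.
            if not p.startswith("rank:") or "/" not in p:
                raise ValueError(f"expected a device placement, got: {p!r}")


spec = ChunkShardingSpec(dim=0, placements=["rank:0/cuda:0", "rank:1/cuda:1"])
```

Restricting the constructor to devices keeps the common path simple; reintroducing a general PlacementSpec later would only require widening the accepted type.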
ghstack-source-id: 131842525
Test Plan: waitforbuildbot
Reviewed By: SciPioneer, rohan-varma
Differential Revision: D29116463
fbshipit-source-id: a6f2b3f1346ac6afab91c9595d4cae4f4da04fda