pytorch
52e76a30 - fix ShardedTensor.gather when shard is empty (#110962)

fix ShardedTensor.gather when shard is empty (#110962)

Summary: The current ShardedTensor.gather does not work as expected when the shard is empty on any rank. The root cause: when a sharded tensor has no placement on a specific rank, the metadata does not include that rank's placement, which raises a KeyError in:

```
shard_offset = shard_placement[shard.metadata][1]
```

It is fixed by adding an empty tensor check.

Test Plan:
before change:
after change:

Differential Revision: D50114085

Pull Request resolved: https://github.com/pytorch/pytorch/pull/110962

Approved by: https://github.com/wz337
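The failure mode and the fix can be illustrated with a minimal, self-contained sketch. This is not PyTorch's actual `ShardedTensor.gather` implementation; the function name `gather_offsets`, the `(metadata_key, data)` shard representation, and the `shard_placement` dict are hypothetical stand-ins for the structures the commit message describes. The key point is that the placement map is built only from non-empty placements, so indexing it with an empty shard's metadata raises `KeyError` unless the empty shard is skipped first.

```python
# Hypothetical sketch (not PyTorch's real code) of the bug described above.
# shard_placement maps shard metadata -> (rank, offset), and is built only
# from ranks that actually hold data. A rank whose shard is empty is absent
# from the map, so the unguarded lookup raises KeyError.

def gather_offsets(local_shards, shard_placement):
    """local_shards: list of (metadata_key, data) pairs; data may be empty.
    shard_placement: {metadata_key: (rank, offset)}, non-empty shards only."""
    offsets = []
    for metadata_key, data in local_shards:
        if len(data) == 0:
            # The fix from the commit, in spirit: check for an empty shard
            # and skip it instead of performing the placement lookup.
            continue
        # Mirrors the failing line: shard_offset = shard_placement[shard.metadata][1]
        offsets.append(shard_placement[metadata_key][1])
    return offsets

# "s1" is an empty shard, so it has no entry in the placement map.
placement = {"s0": (0, 0)}
shards = [("s0", [1, 2, 3]), ("s1", [])]
print(gather_offsets(shards, placement))  # prints [0]; without the guard, KeyError: 's1'
```

Without the `len(data) == 0` guard, the second iteration would look up `placement["s1"]` and fail, which matches the KeyError the commit attributes to missing per-rank placement metadata.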