pytorch
cc685bcc - Allow for custom sharding specs to register their own ops.

Allow for custom sharding specs to register their own ops.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/76360

Customized ShardingSpecs can be entirely arbitrary, so their ops cannot be handled generically: they may not fit the patterns supported by the built-in ShardingSpecs. We therefore introduce a framework that lets a ShardingSpec override ops:

1) In the dispatch system, if a ShardingSpec has a customized op, the op registered for that ShardingSpec is invoked.
2) Accordingly, all ChunkShardingSpec-specific ops have been moved under that ShardingSpec.
3) There is a set of ShardingSpec-agnostic ops (e.g. elementwise ops) that are supported across any ShardingSpec.
4) If an op is not found for a particular ShardingSpec, the default set of ops is searched for that op.

Differential Revision: [D35917912](https://our.internmc.facebook.com/intern/diff/D35917912/)

Approved by: https://github.com/wanchaol
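The following is a minimal sketch of the dispatch idea described above: a per-ShardingSpec op registry consulted first, with a fallback to a spec-agnostic default registry. All names here (ShardingSpec, ChunkShardingSpec, register_op, dispatch) are illustrative stand-ins for this sketch, not the actual API introduced by the PR.

```python
# Sketch: per-ShardingSpec op registries with a spec-agnostic fallback.
from typing import Callable, Dict, Optional, Type

import torch


class ShardingSpec:
    """Base class; each subclass may register its own op handlers."""


class ChunkShardingSpec(ShardingSpec):
    """Example custom spec with spec-specific ops."""


# Per-spec registries plus a default registry of spec-agnostic ops.
_CUSTOM_OPS: Dict[Type[ShardingSpec], Dict[Callable, Callable]] = {}
_DEFAULT_OPS: Dict[Callable, Callable] = {}


def register_op(op: Callable, spec_cls: Optional[Type[ShardingSpec]] = None):
    """Decorator: register a handler for `op`, either for one spec or as a default."""
    def wrapper(handler: Callable) -> Callable:
        if spec_cls is None:
            _DEFAULT_OPS[op] = handler
        else:
            _CUSTOM_OPS.setdefault(spec_cls, {})[op] = handler
        return handler
    return wrapper


def dispatch(spec: ShardingSpec, op: Callable, *args, **kwargs):
    """Prefer the op registered for this spec; fall back to the default set."""
    handler = _CUSTOM_OPS.get(type(spec), {}).get(op) or _DEFAULT_OPS.get(op)
    if handler is None:
        raise RuntimeError(
            f"No sharded implementation of {op} for {type(spec).__name__}"
        )
    return handler(*args, **kwargs)


# ChunkShardingSpec-specific op (point 2 above); the body is a placeholder.
@register_op(torch.matmul, ChunkShardingSpec)
def _chunk_matmul(a, b):
    return torch.matmul(a, b)


# Spec-agnostic elementwise op available to every spec (points 3 and 4 above).
@register_op(torch.nn.functional.gelu)
def _default_gelu(x):
    return torch.nn.functional.gelu(x)
```

With this layout, `dispatch(ChunkShardingSpec(), torch.matmul, a, b)` hits the spec-specific handler, while `dispatch(ChunkShardingSpec(), torch.nn.functional.gelu, x)` falls through to the shared default registry.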