pytorch
ff71f457 - [FSDP] Add `FSDPExtensions` for TP support (#85039)

This adds `FSDPExtensions` to enable TP + FSDP composability. To be agnostic to both `ShardedTensor` and `DistributedTensor`, the design relies on customizable hooks.

Some notes:
- I preferred the `_ext` prefix (short for "extension") over `_param_extension` simply because it is shorter. It should not matter much because it is purely internal facing.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/85039
Approved by: https://github.com/kumpera, https://github.com/fegin
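The commit message only names the design ("customizable hooks"), so here is a minimal sketch of what such a hook interface could look like. The class name `FSDPExtensions` comes from the commit; the method names, signatures, and the no-op implementation below are illustrative assumptions and may differ from the merged internal API.

```python
# Illustrative sketch only: approximates the hook-based extension
# pattern described in the commit. Method names and signatures are
# assumptions, not the guaranteed internal PyTorch API.
from abc import ABC, abstractmethod
from typing import Any, Optional, Tuple

import torch


class FSDPExtensions(ABC):
    """Hooks that keep FSDP agnostic to the sharded-tensor type.

    A tensor-parallel integration (e.g. one built on `ShardedTensor`
    or `DistributedTensor`) implements these hooks so FSDP can flatten
    and unflatten parameters without hard-coding either type.
    """

    @abstractmethod
    def pre_flatten_transform(
        self, tensor: torch.Tensor
    ) -> Tuple[torch.Tensor, Optional[Any]]:
        """Convert a (possibly sharded) tensor to a local tensor before
        FSDP flattens it, returning any extension metadata to keep."""
        ...

    @abstractmethod
    def post_unflatten_transform(
        self, tensor: torch.Tensor, param_extension: Any
    ) -> torch.Tensor:
        """Rebuild the sharded tensor from the local tensor after FSDP
        unflattens it, using the saved extension metadata."""
        ...


class _NoopExtensions(FSDPExtensions):
    """Hypothetical default: plain tensors pass through unchanged."""

    def pre_flatten_transform(self, tensor):
        return tensor, None

    def post_unflatten_transform(self, tensor, param_extension):
        return tensor


if __name__ == "__main__":
    ext = _NoopExtensions()
    t = torch.randn(4, 4)
    local, meta = ext.pre_flatten_transform(t)
    assert ext.post_unflatten_transform(local, meta) is t
```

The design choice here is that FSDP calls the hooks at its flatten/unflatten boundaries, so supporting a new sharded-tensor type only requires registering a different `FSDPExtensions` implementation rather than changing FSDP's core sharding logic.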