DeepSpeed
ddeb0c19 - Fix patch for parameter partitioning in zero.Init() (#6388)

Fix patch for parameter partitioning in zero.Init() (#6388)

This PR fixes an issue addressed in #5921. With this change, we apply the parameter-partitioning patch only to classes that define their own `__init__`, so the patch is not applied multiple times. A class without its own `__init__` now uses its superclass's, so this PR also applies the patch to the root class, `torch.nn.modules.module.Module`.

Thanks @VeryLazyBoy for the report and initial solution.

---------

Co-authored-by: Logan Adams <114770087+loadams@users.noreply.github.com>
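
Below is a minimal sketch of the patching strategy the commit describes, not DeepSpeed's actual implementation: wrap `__init__` only on classes that define their own `__init__` (so a subclass that merely inherits `__init__` is not patched twice), and also patch the root class `torch.nn.modules.module.Module` so classes without their own `__init__` still go through a patched constructor. The helper names (`_wrap_init`, `_partition_after_init`, `patch_module_inits`) are hypothetical placeholders.

```python
import torch


def _partition_after_init(module):
    # Stand-in for the real work done under zero.Init(): partitioning the
    # parameters the module just created in its __init__.
    print(f"partitioning parameters of {module.__class__.__name__}")


def _wrap_init(cls):
    orig_init = cls.__init__

    def wrapped_init(self, *args, **kwargs):
        orig_init(self, *args, **kwargs)
        _partition_after_init(self)

    wrapped_init._ds_patched = True  # marker to avoid re-wrapping
    cls.__init__ = wrapped_init


def _all_subclasses(cls):
    for sub in cls.__subclasses__():
        yield sub
        yield from _all_subclasses(sub)


def patch_module_inits():
    # Include the root class explicitly so classes that only inherit
    # __init__ still pass through a patched constructor via their superclass.
    classes = [torch.nn.modules.module.Module, *_all_subclasses(torch.nn.Module)]
    for cls in classes:
        # Patch only classes that define __init__ in their own __dict__;
        # otherwise the inherited (already patched) __init__ would be
        # wrapped a second time.
        if "__init__" in cls.__dict__ and not getattr(cls.__init__, "_ds_patched", False):
            _wrap_init(cls)
```

In DeepSpeed the equivalent patching is applied while the `zero.Init()` context is active and undone on exit; the sketch above only illustrates the "patch classes that define their own `__init__`, plus the root `Module`" rule.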