DeepSpeed
DeepSpeedZeroOptimizer: refactor bit16 flattening to support more accelerators
#4833
Merged
tjruwase merged 11 commits into deepspeedai:master from nelyahu:zeroOptParamsFlatenning
ZeroOptimizer: avoid storage sharing when flattening params
45d7a034
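For context on what "storage sharing when flattening" refers to: the usual ZeRO stage 1/2 pattern copies each bit16 param into one contiguous flat buffer and then re-points every param at a view of that buffer, so params end up sharing the flat buffer's storage while their original storages become freeable. A minimal sketch of that pattern, assuming PyTorch's internal flatten helpers; the function name `flatten_and_repoint` is hypothetical, not from this PR:

```python
import torch
from torch._utils import _flatten_dense_tensors, _unflatten_dense_tensors

def flatten_and_repoint(params):
    """Hypothetical sketch: copy params into one contiguous buffer, then
    make each param a view into it so the old per-param storages can be freed."""
    flat = _flatten_dense_tensors([p.data for p in params])  # fresh contiguous storage
    views = _unflatten_dense_tensors(flat, [p.data for p in params])
    for p, v in zip(params, views):
        p.data = v  # p now shares storage with `flat`, not its old buffer
    return flat

params = [torch.nn.Parameter(torch.randn(4, dtype=torch.bfloat16)) for _ in range(3)]
flat = flatten_and_repoint(params)
assert flat.numel() == 12
assert params[0].data.data_ptr() == flat.data_ptr()  # first param is a view at offset 0
```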
Merge branch 'microsoft:master' into zeroOptParamsFlatenning
324e33f9
nelyahu requested a review from tjruwase 2 years ago
nelyahu requested a review from mrwyattii 2 years ago
nelyahu changed the title from "DeepSpeedZeroOptimizer: refactor bit16 flatenning to support more accelerators" to "DeepSpeedZeroOptimizer: refactor bit16 flattening to support more accelerators" 2 years ago
tjruwase commented on 2023-12-18 (5 review comments)
tjruwase assigned tjruwase 2 years ago
Merge branch 'master' into zeroOptParamsFlatenning
2c794564
Use meta tensor instead of reconstructing CPU tensors
cbd6c70b
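Meta tensors are a natural fit for the bookkeeping this commit message describes: they carry shape, dtype, and stride metadata with no backing storage on any device, so shape records do not require allocating real CPU tensors. A minimal illustration of the mechanism (not the PR's actual code):

```python
import torch

# Meta tensors hold only metadata (shape, dtype, strides); no storage is
# allocated, so they are free stand-ins for tensors that were released.
shapes = [(1024, 1024), (4096,), (512, 30522)]
metas = [torch.empty(s, dtype=torch.bfloat16, device="meta") for s in shapes]

total_numel = sum(t.numel() for t in metas)  # numel is pure metadata
assert total_numel == 1024 * 1024 + 4096 + 512 * 30522
# metas[0].cpu() would raise: there is no data to copy, only metadata.
```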
Merge branch 'master' into zeroOptParamsFlatenning
067e079f
tjruwase commented on 2024-01-02
fix orig_group_numel: missing accumulation
8b0d4cee
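The fix's message points at a classic bug shape: a running total that was assigned instead of accumulated, so only the last param in the group was counted. A hedged reconstruction (the variable name is borrowed from the commit message; the loop is illustrative, not the PR's diff):

```python
import torch

param_group = [torch.randn(3), torch.randn(5), torch.randn(7)]

# Buggy form: `orig_group_numel = p.numel()` overwrites on every iteration.
orig_group_numel = 0
for p in param_group:
    orig_group_numel += p.numel()  # '+=' accumulates across the whole group

assert orig_group_numel == 15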
Merge branch 'master' into zeroOptParamsFlatenning
2e41d9ef
tjruwase approved these changes on 2024-01-03
Merge branch 'master' into zeroOptParamsFlatenning
890ae610
Merge branch 'master' into zeroOptParamsFlatenning
5ade6863
fix orig_group_numel: move accumulation two lines above
c10b626f
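A plausible reading of this follow-up, offered as a guess rather than a fact from the diff: the accumulation has to happen before alignment padding is appended, so that `orig_group_numel` reflects the unpadded parameters rather than the padded flat buffer. A sketch of that ordering, with an illustrative alignment of 4:

```python
import torch

params = [torch.randn(3), torch.randn(5)]
alignment = 4

orig_group_numel = 0
padded = []
for p in params:
    orig_group_numel += p.numel()  # count *before* padding is appended
    pad = (alignment - p.numel() % alignment) % alignment
    padded.append(torch.cat([p, p.new_zeros(pad)]))

flat = torch.cat(padded)
assert orig_group_numel == 8   # original elements only
assert flat.numel() == 12      # the padded flat buffer is larger
```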
Merge branch 'master' into zeroOptParamsFlatenning
57b01129
tjruwase merged ade98365 into master 2 years ago
nelyahu deleted the zeroOptParamsFlatenning branch 2 years ago
Reviewers: tjruwase, mrwyattii
Assignees: tjruwase
Labels: none yet
Milestone: no milestone