DeepSpeed
tolerating missing optimizer states for MoE [2nd attempt] #4120
Merged
tjruwase merged 3 commits into deepspeedai:master from clumsy:fix/missing_moe_optim_states
clumsy requested a review from jeffra 2 years ago
clumsy requested a review from tjruwase 2 years ago
clumsy requested a review from mrwyattii 2 years ago
clumsy force-pushed from 343f7ef2 to 6f1be41c 2 years ago
clumsy force-pushed from 6f1be41c to f19a973f 2 years ago
tjruwase commented on 2023-08-17
clumsy force-pushed from f19a973f to 6561f327 2 years ago
clumsy commented on 2023-08-18
tjruwase commented on 2023-08-21
clumsy force-pushed from 830d495a to 39aaad86 2 years ago
skipping redundant MoE optimizer state loading (commit b8927b8e)
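The idea behind this commit can be sketched roughly as follows: when restoring optimizer state from a checkpoint shard, MoE (expert-parallel) parameters whose state lives in another rank's shard are tolerated as missing rather than treated as an error. All function and parameter names below are hypothetical illustrations, not DeepSpeed's actual API:

```python
def load_optimizer_states(optimizer_state, checkpoint_states, is_moe_param):
    """Merge checkpoint optimizer states into the live optimizer state dict,
    tolerating entries that are absent for MoE (expert-parallel) params.

    Hypothetical sketch: names are illustrative, not DeepSpeed's real API.
    """
    loaded, skipped = [], []
    for name in optimizer_state:
        if name in checkpoint_states:
            optimizer_state[name] = checkpoint_states[name]
            loaded.append(name)
        elif is_moe_param(name):
            # The expert's optimizer state is stored in a different
            # expert-parallel shard; loading it here would be redundant,
            # so the gap is tolerated instead of raising an error.
            skipped.append(name)
        else:
            raise KeyError(f"missing optimizer state for non-MoE param {name}")
    return loaded, skipped


state = {"dense.weight": None, "expert.0.weight": None}
ckpt = {"dense.weight": {"step": 10}}
loaded, skipped = load_optimizer_states(
    state, ckpt, is_moe_param=lambda n: n.startswith("expert.")
)
```

Here the dense parameter's state is restored from the checkpoint, while the expert parameter (absent from this shard) is skipped instead of aborting the load.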
clumsy force-pushed from 39aaad86 to b8927b8e 2 years ago
Merge branch 'master' into fix/missing_moe_optim_states (commit 275bce6a)
tjruwase approved these changes on 2023-08-28
Merge branch 'master' into fix/missing_moe_optim_states (commit a95b97a4)
tjruwase merged e801e6d7 into master 2 years ago
clumsy deleted the fix/missing_moe_optim_states branch 2 years ago
Reviewers: tjruwase, jeffra, mrwyattii
Assignees: no one assigned
Labels: none yet
Milestone: no milestone