DeepSpeed
Support MoE for pipeline models
#5338
Merged
loadams merged 11 commits into deepspeedai:master from mosheisland:moe/pipe
mosheisland requested a review from tjruwase 1 year ago
mosheisland requested a review from mrwyattii 1 year ago
mosheisland requested a review from duli2012 1 year ago
mosheisland requested a review from awan-10 1 year ago
mosheisland requested a review from arashb 1 year ago
mosheisland requested a review from loadams 1 year ago
tohtana requested a review from tohtana 1 year ago
MOE: Support bf16 grads reduce for pipeline (0050fdad)
MOE: Use backward compatible methods to access tp info (d04cb9cc)
MOE: Enable save MoE checkpoint for Pipeline models (f5c4d1a4)
MOE: Support display of MoE loss for Pipeline models (a0e80123)
MOE: Fix loading checkpoint of Pipeline models (b20db806)
MOE: Fix group for max capacity all-reduce (a46f35d5)
MOE: Enhance expert group creation for pipeline (d8ecc22b)
MOE: Update global norm calculation for pipeline (0f9d2b58)
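For context on the first commit above: DeepSpeed selects bf16 training (and hence the bf16 gradient-reduce path this PR extends to pipeline models) through the engine's JSON config. A minimal fragment of generic DeepSpeed usage, not code from this PR, looks like:

```json
{
  "train_micro_batch_size_per_gpu": 1,
  "bf16": {
    "enabled": true
  }
}
```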
mosheisland force-pushed from 2efbf80a to 7a5e8881 1 year ago
mosheisland force-pushed from 7a5e8881 to 0f9d2b58 1 year ago
tohtana approved these changes on 2024-04-04
MOE: fix style issue in pipe load_module_state_dict (b6067d7e)
Merge branch 'master' into moe/pipe (526ce7f2)
Merge branch 'master' into moe/pipe (4d8bf271)
tohtana enabled auto-merge 1 year ago
loadams merged 08e0733e into master 1 year ago
Reviewers: tohtana, tjruwase, mrwyattii, duli2012, awan-10, arashb, loadams
Assignees: no one assigned
Labels: none yet
Milestone: no milestone