DeepSpeed
8d98e171 - Enable mixtral 8x7b autotp (#5257)
Commit · 1 year ago
Enable mixtral 8x7b autotp (#5257)

This PR enables automatic tensor parallelism (autotp) for the Mixtral 8x7B MoE model.

Co-authored-by: Logan Adams <114770087+loadams@users.noreply.github.com>
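The change touches DeepSpeed's AutoTP logic (deepspeed/module_inject/auto_tp.py) so that Mixtral 8x7B can be sharded automatically for tensor-parallel inference. The sketch below shows how AutoTP is typically invoked through deepspeed.init_inference with kernel injection disabled; the checkpoint name, TP degree, and launch command are illustrative assumptions, not details taken from this commit.

```python
import os

import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

# Checkpoint name and TP degree are assumptions for illustration.
model_name = "mistralai/Mixtral-8x7B-v0.1"
local_rank = int(os.getenv("LOCAL_RANK", "0"))

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

# AutoTP path: with kernel injection disabled, DeepSpeed shards the model's
# linear layers across the tensor-parallel ranks automatically.
model = deepspeed.init_inference(
    model,
    tensor_parallel={"tp_size": int(os.getenv("WORLD_SIZE", "1"))},
    dtype=torch.float16,
    replace_with_kernel_inject=False,  # use AutoTP rather than kernel injection
)

inputs = tokenizer("DeepSpeed is", return_tensors="pt").to(f"cuda:{local_rank}")
outputs = model.module.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Launched with something like `deepspeed --num_gpus 8 run_inference.py` (script name hypothetical), each rank would hold a shard of the model's linear layers; which Mixtral submodules AutoTP recognizes and partitions is determined by auto_tp.py, the file this commit modifies.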
References
#5257 - Enable mixtral 8x7b autotp
Author
Yejing-Lai
Parents
e5dd5501
Files (1)
deepspeed/module_inject/auto_tp.py