DeepSpeed
f82846d7
- Adding additional instructions in the compression tutorial on pre-training distillation and quantization for GPT (#2197)
Commit
3 years ago
Adding additional instructions in the compression tutorial on pre-training distillation and quantization for GPT (#2197)

Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
References
#2197 - Adding the compression tutorial on GPT distillation and quantization
Author
minjiaz
Parents
b62e0cc5