transformers
da971b22
- Keep relevant weights in fp32 when `model._keep_in_fp32_modules` is set even when `accelerate` is not installed (#26225)
Committed 2 years ago
Keep relevant weights in fp32 when `model._keep_in_fp32_modules` is set even when `accelerate` is not installed (#26225)

* fix bug where weight would not be kept in fp32
* nit
* address review comments
* fix test
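As a hedged illustration of the behavior this commit restores: some architectures declare `_keep_in_fp32_modules` (T5, for example, lists `"wo"`, its feed-forward output projection), and those modules should stay in fp32 even when the rest of the model is loaded in half precision and `accelerate` is not installed. A minimal sketch, where the `t5-small` checkpoint and the exact module path are illustrative assumptions:

```python
import torch
from transformers import T5ForConditionalGeneration

# T5 sets _keep_in_fp32_modules = ["wo"], so the feed-forward output
# projection is expected to remain fp32 in a half-precision load.
model = T5ForConditionalGeneration.from_pretrained(
    "t5-small", torch_dtype=torch.float16
)

# With this fix, the dtypes below hold with or without `accelerate` installed.
wo = model.encoder.block[0].layer[1].DenseReluDense.wo
print(wo.weight.dtype)            # torch.float32 (kept in fp32)
print(model.shared.weight.dtype)  # torch.float16 (rest of the model)
```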
References
#26225 - Keep relevant weights in fp32 when `model._keep_in_fp32_modules` is set even when `accelerate` is not installed
Author
fxmarty
Parents
e3a4bd2b