accelerate · commit 5ae61111
Allow FSDP to use with `torch.autocast` for bfloat16 mixed precision (#2033)
Committed 2 years ago
* Ignore native_amp when FSDP is used
* Rollback condition
* Fix mixed precision of bfloat16 for FSDP
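Per the commit message, the fix makes Accelerate skip its built-in `native_amp` handling when FSDP is active, so bfloat16 mixed precision can be driven by an explicit `torch.autocast` context instead. Below is a minimal sketch of the usage this enables, not code from the commit itself; it assumes FSDP is configured separately (e.g. via `accelerate config` and `accelerate launch`), and the model, optimizer, and loss are illustrative placeholders.

```python
import torch
from accelerate import Accelerator

# FSDP itself is assumed to be enabled through the Accelerate config;
# `mixed_precision="bf16"` requests bfloat16 mixed precision.
accelerator = Accelerator(mixed_precision="bf16")

# Toy model and optimizer, purely for illustration.
model = torch.nn.Linear(1024, 1024)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model, optimizer = accelerator.prepare(model, optimizer)

batch = torch.randn(8, 1024, device=accelerator.device)

# Per the commit message, with FSDP active Accelerate no longer applies its
# own native-AMP wrapping, so an explicit torch.autocast context in bfloat16
# around the forward pass behaves as expected.
with torch.autocast(device_type=accelerator.device.type, dtype=torch.bfloat16):
    loss = model(batch).float().pow(2).mean()

accelerator.backward(loss)
optimizer.step()
optimizer.zero_grad()
```

Note that bfloat16, unlike float16, needs no gradient scaler, which is why deferring to `torch.autocast` rather than Accelerate's native-AMP path is sufficient here.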
References
#2033 - Allow FSDP to use with `torch.autocast` for bfloat16 mixed precision
Author
brcps12
Parents
230a5f54