accelerate
5ae61111 - Allow FSDP to be used with `torch.autocast` for bfloat16 mixed precision (#2033)

* Ignore native_amp when FSDP is used
* Rollback condition
* Fix mixed precision of bfloat16 for FSDP
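For context, here is a minimal sketch of the usage this commit enables: running bf16 mixed precision together with FSDP, where Accelerate wraps the forward pass in `torch.autocast` rather than its native AMP path. This is not the commit's diff; the toy model, shapes, and training loop are illustrative, and FSDP itself is assumed to be enabled separately (e.g. via `accelerate config`).

```python
import torch
from accelerate import Accelerator

# bf16 mixed precision; with FSDP configured, Accelerate now routes this
# through torch.autocast instead of ignoring it on the FSDP path.
accelerator = Accelerator(mixed_precision="bf16")

model = torch.nn.Linear(1024, 1024)  # toy model for illustration
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model, optimizer = accelerator.prepare(model, optimizer)

for _ in range(3):
    batch = torch.randn(8, 1024, device=accelerator.device)
    # accelerator.autocast() applies torch.autocast with the configured
    # dtype (bfloat16 here) around the forward pass.
    with accelerator.autocast():
        loss = model(batch).pow(2).mean()
    accelerator.backward(loss)
    optimizer.step()
    optimizer.zero_grad()
```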