accelerate #2033 (Merged)
Allow FSDP to be used with `torch.autocast` for bfloat16 mixed precision

Commits
  • Ignore native_amp when FSDP is used
    brcps12 committed 2 years ago
  • Rollback condition
    brcps12 committed 2 years ago
  • Fix mixed precision of bfloat16 for FSDP
    brcps12 committed 2 years ago
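Taken together, these commits mean Accelerate no longer disables its autocast path when the model is wrapped in FSDP, so the forward pass runs under `torch.autocast` in bfloat16. A minimal sketch of that pattern (not the PR's exact code; it assumes a single-process `nccl` group on one GPU, and the model and optimizer here are illustrative):

```python
import os
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def main():
    # Illustrative single-process setup so FSDP can initialize.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("nccl", rank=0, world_size=1)
    torch.cuda.set_device(0)

    model = FSDP(torch.nn.Linear(8, 8).cuda())
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
    x = torch.randn(4, 8, device="cuda")

    # Run the FSDP-wrapped forward under autocast with bfloat16.
    # Unlike float16, bfloat16 needs no GradScaler.
    with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        loss = model(x).sum()
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```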