Allow FSDP to use with `torch.autocast` for bfloat16 mixed precision #2033
Commits:
- Ignore native_amp when FSDP is used (`e0ffd4dd`)
- Rollback condition (`6407ffb3`)
- Fix mixed precision of bfloat16 for FSDP (`6bf41ae7`)
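Taken together, these commits let Accelerate skip its GradScaler-based `native_amp` path and instead rely on `torch.autocast` when FSDP handles bf16 mixed precision, since bf16 does not require loss scaling. A minimal sketch of the pattern this PR enables is below; `MyModel` and `dataloader` are hypothetical placeholders, and it assumes `torch.distributed` is already initialized, while `torch.autocast` and `FullyShardedDataParallel` are real PyTorch APIs:

```python
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Hypothetical model and data; assumes the process group is initialized.
model = FSDP(MyModel().cuda())
optimizer = torch.optim.AdamW(model.parameters())

for batch in dataloader:
    # bf16 autocast needs no GradScaler, so the scaler-based native_amp
    # path can be bypassed when FSDP manages the sharded parameters.
    with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        loss = model(batch).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```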
brcps12 changed the title from "Ignore torch.autocast for mixed precision when FSDP is used" to "Fix mixed precision of bfloat16 when using FSDP" 2 years ago
brcps12 changed the title from "Fix mixed precision of bfloat16 when using FSDP" to "Add FSDP allowed to wrap with `torch.autocast` for bfloat16 mixed precision" 2 years ago
brcps12 changed the title from "Add FSDP allowed to wrap with `torch.autocast` for bfloat16 mixed precision" to "Allow FSDP to use with `torch.autocast` for bfloat16 mixed precision" 2 years ago
pacman100 approved these changes on 2023-10-06
pacman100 merged `5ae61111` into main 2 years ago