accelerate

Allow FSDP to use with `torch.autocast` for bfloat16 mixed precision #2033

Merged
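For context, the mechanism this PR enables is running an FSDP-wrapped model's forward pass inside a bfloat16 `torch.autocast` region. Below is a minimal sketch in plain PyTorch, not code from the PR, assuming a distributed launch (e.g. `torchrun`) and an illustrative toy model:

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Assumes a distributed launch, e.g. `torchrun --nproc_per_node=2 script.py`.
dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank())

# Toy model, for illustration only.
model = FSDP(torch.nn.Linear(1024, 1024).cuda())

batch = torch.randn(8, 1024, device="cuda")

# The point of the PR: the forward pass runs under bfloat16 autocast,
# so matmuls execute in bf16 while the sharded master weights stay fp32.
with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
    out = model(batch)

loss = out.float().mean()
loss.backward()
```

Unlike fp16, bf16 autocast needs no `GradScaler`, so entering the autocast context is sufficient here.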


brcps12 committed Ignore native_amp when FSDP is used (e0ffd4dd)
muellerzr requested a review from pacman100 2 years ago
brcps12 committed Rollback condition (6407ffb3)
brcps12 committed Fix mixed precision of bfloat16 for FSDP (6bf41ae7)
brcps12 changed the title from "Ignore torch.autocast for mixed precision when FSDP is used" to "Fix mixed precision of bfloat16 when using FSDP" 2 years ago
brcps12 changed the title from "Fix mixed precision of bfloat16 when using FSDP" to "Add FSDP allowed to wrap with `torch.autocast` for bfloat16 mixed precision" 2 years ago
brcps12 changed the title from "Add FSDP allowed to wrap with `torch.autocast` for bfloat16 mixed precision" to "Allow FSDP to use with `torch.autocast` for bfloat16 mixed precision" 2 years ago
pacman100 approved these changes on 2023-10-06

pacman100 merged 5ae61111 into main 2 years ago
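After the merge, the change can be exercised through the Accelerate API roughly as below. This is a hedged usage sketch rather than code from the PR; it assumes FSDP is enabled via `accelerate config` or the default `FullyShardedDataParallelPlugin` shown, and the model and optimizer are toy stand-ins:

```python
import torch
from accelerate import Accelerator, FullyShardedDataParallelPlugin

# Assumes an FSDP-capable launch, e.g. `accelerate launch script.py`.
fsdp_plugin = FullyShardedDataParallelPlugin()
accelerator = Accelerator(mixed_precision="bf16", fsdp_plugin=fsdp_plugin)

# Toy model and optimizer, for illustration only.
model = torch.nn.Linear(1024, 1024)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model, optimizer = accelerator.prepare(model, optimizer)

batch = torch.randn(8, 1024, device=accelerator.device)
# With this fix, the forward pass under FSDP runs inside bf16
# torch.autocast instead of silently skipping mixed precision.
output = model(batch)
loss = output.float().mean()
accelerator.backward(loss)
optimizer.step()
optimizer.zero_grad()
```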
