accelerate
Allow FSDP to be used with `torch.autocast` for bfloat16 mixed precision
#2033
Merged
Commits (3)
Ignore native_amp when FSDP is used · brcps12 committed 2 years ago
Rollback condition · brcps12 committed 2 years ago
Fix mixed precision of bfloat16 for FSDP · brcps12 committed 2 years ago