pytorch
9ba1630b - Limit world size in test_fsdp_pure_fp16 (#85957)

Limit world size in test_fsdp_pure_fp16 (#85957)

When using more than 5 GPUs for this test, the difference between the reference output tensor and the FSDP output tensor becomes too large, likely due to the usual floating-point inaccuracies, which are amplified because FP16 is used. So cap the world size (i.e., the number of GPUs) at a maximum of 5.

Fixes #78975

Pull Request resolved: https://github.com/pytorch/pytorch/pull/85957
Approved by: https://github.com/awgu
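The capping logic described above can be sketched as follows. This is a minimal illustration, not the actual test code: the helper name `capped_world_size` is hypothetical, and in the real test the available GPU count would come from `torch.cuda.device_count()`.

```python
def capped_world_size(available_gpus: int, cap: int = 5) -> int:
    # Hypothetical helper mirroring the fix: run the FSDP pure-FP16
    # test with at most `cap` ranks, so that accumulated FP16 rounding
    # error across ranks stays within the test's tolerance.
    return min(available_gpus, cap)

# On an 8-GPU machine the test would now use only 5 ranks,
# while smaller machines are unaffected.
print(capped_world_size(8))  # 5
print(capped_world_size(2))  # 2
```

The cap of 5 is the value chosen in the commit; machines with 5 or fewer GPUs see no behavior change.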