pytorch
03ed8cbf
- Workaround for bug in DistributedDataParallel (#46385)
Commit
3 years ago
Workaround for bug in DistributedDataParallel (#46385)

Fix the DistributedDataParallelSingleProcessTest to work around a limitation in DistributedDataParallel where the batch_size needs to be evenly divisible by the number of GPUs used.

See #46175
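The limitation is easiest to see in single-process DistributedDataParallel, where one input batch is scattered across all device_ids, so the batch size must split evenly across the GPUs. Below is a minimal sketch of the adjustment the test makes; it is not the actual test code, and the helper divisible_batch_size is hypothetical, introduced only to illustrate rounding the batch size to a multiple of the GPU count.

# Minimal sketch (assumption: not the actual test code from this commit).
# In single-process DDP with multiple device_ids, each forward pass scatters
# the input batch across the GPUs, so batch_size must be a multiple of the
# number of GPUs; otherwise the scatter produces uneven (or empty) chunks.
import torch

def divisible_batch_size(requested: int, num_gpus: int) -> int:
    """Hypothetical helper: round the requested batch size down to a
    multiple of num_gpus (but never below num_gpus itself)."""
    return max(num_gpus, (requested // num_gpus) * num_gpus)

num_gpus = max(torch.cuda.device_count(), 1)
batch_size = divisible_batch_size(6, num_gpus)
# e.g. with 4 GPUs, a requested batch of 6 becomes 4, one sample per GPU
print(f"GPUs: {num_gpus}, adjusted batch size: {batch_size}")

With a batch size chosen this way, the scatter inside DistributedDataParallel gives every GPU an equally sized chunk, which is what the test adjustment in this commit ensures.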
References
#46385 - Workaround for bug in DistributedDataParallel
Author
Flamefire
Parents
286647bc