pytorch
- Spread distributed backends among all distributed shards (#86837)
Committed: 2 years ago
Spread distributed backends among all distributed shards (#86837)

So that they can be run in parallel without stepping on each other's toes.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/86837
Approved by: https://github.com/clee2000
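The idea behind the commit is to assign the distributed test backends across the available CI shards so that no two shards run the same backend and all shards can execute in parallel. A minimal sketch of such a round-robin assignment (the function name and backend list are illustrative assumptions, not PyTorch's actual CI sharding code):

```python
# Hypothetical sketch of spreading test backends round-robin across shards,
# in the spirit of the commit; not PyTorch's actual CI implementation.
def spread_across_shards(backends, num_shards):
    """Assign each backend to a shard round-robin so shards can run in parallel."""
    shards = [[] for _ in range(num_shards)]
    for i, backend in enumerate(backends):
        shards[i % num_shards].append(backend)
    return shards


if __name__ == "__main__":
    # Example backend names (assumed for illustration).
    backends = ["gloo", "nccl", "mpi", "ucc"]
    for shard_id, assigned in enumerate(spread_across_shards(backends, 2)):
        print(f"shard {shard_id}: {assigned}")
```

Because each backend lands on exactly one shard, shards no longer contend for the same backend's resources when run concurrently.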
Author: huydhn
Committer: pytorchmergebot
Parent: 48c648d7