[Trainer] Add optional communication backends for torch.distributed when using GPU #22247
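The PR title indicates that the torch.distributed communication backend becomes selectable rather than hard-coded for GPU runs. A minimal sketch of such selection logic, under the assumption that an explicit user choice wins and the default falls back to "nccl" on GPU and "gloo" on CPU (the helper name and parameters are hypothetical, not taken from the PR diff):

```python
from typing import Optional


def select_backend(requested: Optional[str], cuda_available: bool) -> str:
    """Pick a torch.distributed backend name.

    Hypothetical illustration of the PR's idea, not code from the diff:
    honor an explicitly requested backend; otherwise default to "nccl"
    when CUDA is available and "gloo" otherwise.
    """
    if requested is not None:
        return requested
    return "nccl" if cuda_available else "gloo"
```

The returned string would then be passed as the `backend` argument to `torch.distributed.init_process_group`.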
Commit 7877f28f: Update training_args.py
heya5 changed the title from "Update training_args.py" to "Add optional communication backends for torch.distributed when using GPU" 2 years ago
heya5 changed the title from "Add optional communication backends for torch.distributed when using GPU" to "[Trainer] Add optional communication backends for torch.distributed when using GPU" 2 years ago
sgugger approved these changes on 2023-03-20
sgugger merged commit cf0af9a3 into main 2 years ago