Fixed the issue of the DPO trainer when using one node with multiple GPUs and setting device_map='auto' #29695
Fixed the issue of the DPO trainer when using one node with multiple GPUs
7a17fea9
Before updating, add the assert
91b729c3
run the ruff formatter
4e2f65b6
Merge branch 'main' into feature/fix_dpo_los_update
7d4d6126
Update src/transformers/trainer.py
155ad13f
muellerzr
approved these changes
on 2024-03-19
Remember to run `make style` and `make quality` before committing.
acae705b
Update src/transformers/trainer.py
3ca37964
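The commits above mention adding an assert to guard the device_map='auto' multi-GPU case. The PR diff itself is not shown here, so the following is only a hypothetical sketch of what such a guard could look like: a check that refuses to combine a model already sharded across several devices via device_map='auto' with multi-GPU data parallelism. The function name `check_device_map` and its signature are illustrative, not the actual code from src/transformers/trainer.py.

```python
def check_device_map(model_device_map, n_gpu):
    """Hypothetical guard: raise if a model dispatched across several
    devices (e.g. via device_map='auto') is also about to be wrapped
    for multi-GPU data parallelism, which would conflict."""
    devices = set(model_device_map.values())
    if len(devices) > 1 and n_gpu > 1:
        raise ValueError(
            "Model is already sharded across multiple devices via "
            "device_map='auto'; do not also enable multi-GPU data "
            "parallelism on top of it."
        )
```

A trainer could call such a check early in setup so the misconfiguration fails fast with a clear message instead of surfacing later as a device-placement error.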