transformers
Fixed the DPO trainer issue when using a single node with multiple GPUs and setting `device_map='auto'`
#29695
Merged