transformers
Fixed the DPO trainer issue when using one node with multiple GPUs and setting device_map='auto'
#29695
Merged
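The PR title describes a conflict between `device_map='auto'` and single-node multi-GPU training: with one process per GPU, `'auto'` makes every process try to shard the model across all visible GPUs. A minimal sketch of one common way to resolve this, pinning each process's model to its own local rank instead; the helper `resolve_device_map` and its logic are illustrative assumptions, not the actual code from this PR:

```python
import os

def resolve_device_map(requested="auto"):
    """Hypothetical helper: replace device_map='auto' with a per-process
    placement when running under a multi-GPU launcher (e.g. torchrun)."""
    # Distributed launchers export WORLD_SIZE (total processes) and
    # LOCAL_RANK (this process's GPU index on the node).
    world_size = int(os.environ.get("WORLD_SIZE", "1"))
    if requested == "auto" and world_size > 1:
        local_rank = int(os.environ.get("LOCAL_RANK", "0"))
        # Map the whole model ("" = root module) onto this process's GPU.
        return {"": local_rank}
    # Single process: 'auto' sharding across GPUs is safe to keep.
    return requested

print(resolve_device_map())
```

Run under a single process this returns `"auto"` unchanged; under a two-process launch with `LOCAL_RANK=1` it returns `{"": 1}`, so each replica loads entirely on its own device.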
Commits: 7