transformers
cab048fb
- [`Trainer`] Force `is_model_parallel` when model is loaded in multiple GPUs using `accelerate` (#22532)
Commit · 3 years ago

[`Trainer`] Force `is_model_parallel` when model is loaded in multiple GPUs using `accelerate` (#22532)

* add `is_model_parallel` arg on Trainer
* add warning
* adapt from suggestions
* revert t5 changes
* remove commas
* adapt from suggestions
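The commit makes the Trainer treat a model that `accelerate` has spread across multiple GPUs as already model-parallel, so it is not wrapped again for data parallelism. A minimal sketch of that detection idea, assuming a device map of the kind accelerate attaches as `hf_device_map` (the helper name and exact heuristic here are illustrative, not the library's actual code):

```python
def infer_is_model_parallel(hf_device_map):
    """Return True when a device map spreads a model's weights over more
    than one GPU, so trainer-style code should treat the model as already
    parallelized instead of wrapping it in DataParallel/DDP.

    `hf_device_map` maps module names to devices, e.g. {"encoder": 0,
    "decoder": 1}. Illustrative helper only; transformers' Trainer
    implements its own version of this check internally.
    """
    if not hf_device_map:
        return False
    # Ignore offloaded parts: "cpu" and "disk" are not GPU placements.
    gpu_devices = {d for d in hf_device_map.values() if d not in ("cpu", "disk")}
    return len(gpu_devices) > 1


# A model sharded across two GPUs counts as model-parallel...
print(infer_is_model_parallel({"encoder": 0, "decoder": 1}))   # True
# ...while a single-GPU map with CPU offload does not.
print(infer_is_model_parallel({"encoder": 0, "lm_head": "cpu"}))  # False
```

With this flag forced on, the Trainer skips its usual multi-GPU wrapping, which would otherwise conflict with the placement accelerate already chose.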
References
#22532 - [`Trainer`] Force `is_model_parallel` when model is loaded in multiple GPUs using `accelerate`
Author
younesbelkada
Parents
aecbcb36