transformers
6b1ff250
- fix n_gpu count when no_cuda flag is activated (#3077)
Commit · 5 years ago

fix n_gpu count when no_cuda flag is activated (#3077)

* fix n_gpu count when no_cuda flag is activated
* someone was left behind
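The bug this commit addresses is a common one in the example training scripts: the device selection honored the `--no_cuda` flag, but `n_gpu` was still set from the raw CUDA device count, so CPU-only runs could still take multi-GPU code paths (e.g. `DataParallel`). A minimal sketch of the pattern and the fix, with `setup_device` as an illustrative helper rather than the actual transformers API:

```python
def setup_device(no_cuda: bool, cuda_available: bool, device_count: int):
    """Sketch of the device/GPU-count setup the commit fixes.

    Buggy version (before the fix):
        n_gpu = device_count            # ignored --no_cuda
    Fixed version: gate the count on the flag as well, so a
    --no_cuda run reports zero GPUs and stays on the CPU path.
    """
    use_cuda = cuda_available and not no_cuda
    device = "cuda" if use_cuda else "cpu"
    n_gpu = 0 if no_cuda else device_count
    return device, n_gpu


# With --no_cuda set, n_gpu must be 0 even on a multi-GPU machine.
print(setup_device(no_cuda=True, cuda_available=True, device_count=4))
# Without the flag, the real device count is used.
print(setup_device(no_cuda=False, cuda_available=True, device_count=4))
```

In the real scripts the count comes from `torch.cuda.device_count()`; the "someone was left behind" bullet suggests one script initially missed the same one-line change.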
References
#3077 - fix n_gpu count when no_cuda flag is activated
Author
VictorSanh
Parents
298bed16