pytorch · commit 2b94c6e0
[optim] Adam defaults to fused/foreach when on CUDA + differentiable=False
Committed 2 years ago
Author: janeyx99
Committer: janeyx99
Parent: f471770f