pytorch
f8238d79 - [optim] bugfix when all parameters have no grad (#52944)

Commit
3 years ago
[optim] bugfix when all parameters have no grad (#52944)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/52944

This fixes a bug introduced during the optimizer refactoring (https://github.com/pytorch/pytorch/pull/50411): when all parameters have no grads, `beta`-like hyperparameters should still be defined.

Reviewed By: ngimel
Differential Revision: D26699827
fbshipit-source-id: 8a7074127704c7a4a1fbc17d48a81e23a649f280
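To illustrate the class of bug the commit describes, here is a minimal plain-Python sketch (not PyTorch's actual source). The assumed failure mode: after the refactor, hyperparameters such as `beta` were bound inside the per-parameter loop, but the functional update after the loop used them unconditionally, so a step with no gradients at all raised an error instead of being a no-op. The names `functional_update`, `step_buggy`, and `step_fixed` are hypothetical stand-ins.

```python
def functional_update(params, grads, lr, beta):
    # Toy stand-in for a functional optimizer kernel:
    # one decayed-gradient step per parameter.
    for p, g in zip(params, grads):
        p["value"] -= lr * beta * g

def step_buggy(group):
    # Bug pattern: `beta` is only bound inside the loop over parameters
    # that actually have a gradient, yet the update after the loop
    # references it unconditionally.
    params_with_grad, grads = [], []
    for p in group["params"]:
        if p["grad"] is not None:
            params_with_grad.append(p)
            grads.append(p["grad"])
            beta = group["beta"]  # never runs if no param has a grad
    # UnboundLocalError when every parameter has grad=None:
    functional_update(params_with_grad, grads, group["lr"], beta)

def step_fixed(group):
    # Fix: bind hyperparameters before the loop, so the all-no-grad
    # case is a harmless no-op.
    beta = group["beta"]
    params_with_grad, grads = [], []
    for p in group["params"]:
        if p["grad"] is not None:
            params_with_grad.append(p)
            grads.append(p["grad"])
    functional_update(params_with_grad, grads, group["lr"], beta)

group = {"lr": 0.1, "beta": 0.9, "params": [{"value": 1.0, "grad": None}]}
step_fixed(group)  # no grads anywhere: value stays unchanged, no error
assert group["params"][0]["value"] == 1.0
try:
    step_buggy(group)
except UnboundLocalError:
    print("buggy step fails when no parameter has a grad")
```

The fixed variant mirrors the behaviour the commit restores: an optimizer step over a parameter group in which every `grad` is `None` should silently do nothing, with all hyperparameters still defined.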
Author
Wanchao Liang