5cbe1e49 - [dist_optim] add distributed functional Adam optimizer (#50624)

Commit · 5 years ago
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/50624

Add a TorchScript-compatible functional Adam optimizer to the distributed optimizer.

Test Plan: Imported from OSS

Reviewed By: rohan-varma

Differential Revision: D25932770

Pulled By: wanchaol

fbshipit-source-id: cab3f1164c76186969c284a2c52481b79bbb7190
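For context, a "functional" optimizer keeps no hidden state on `self`: the parameter, gradient, and optimizer state (first/second moment estimates, step count) are all passed in explicitly and the updated values are returned, which is what makes the update scriptable and usable from a distributed optimizer that owns the state remotely. The sketch below is an illustrative scalar version of the Adam update rule, not the actual PyTorch implementation; the function name and signature are hypothetical.

```python
import math

def functional_adam_step(param, grad, exp_avg, exp_avg_sq, step,
                         lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One stateless Adam update on a scalar parameter.

    All optimizer state (exp_avg, exp_avg_sq, step) is passed in and
    returned, so the caller (e.g. a distributed optimizer) owns it.
    """
    # Update biased first and second moment estimates.
    exp_avg = beta1 * exp_avg + (1.0 - beta1) * grad
    exp_avg_sq = beta2 * exp_avg_sq + (1.0 - beta2) * grad * grad
    # Bias correction for the running averages.
    bias_correction1 = 1.0 - beta1 ** step
    bias_correction2 = 1.0 - beta2 ** step
    denom = math.sqrt(exp_avg_sq / bias_correction2) + eps
    # Apply the update.
    param = param - lr * (exp_avg / bias_correction1) / denom
    return param, exp_avg, exp_avg_sq

# First step: the bias-corrected update is approximately lr in magnitude.
p, m, v = functional_adam_step(1.0, 0.5, 0.0, 0.0, step=1)
```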