pytorch @ 1dba329d: Enable step_param for Adam functional optimizer (#62611)

Enable step_param for Adam functional optimizer (#62611)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62611

Enables overlapping the optimizer with the backward pass in DDP for Adam. Additional optimizers, especially Adagrad, will be handled in follow-up diffs.

1. Implement a `step_param` method based on `step` in `_FunctionalAdam` (perf permitting, we can later dedupe `step` to call `step_param`).
2. Modify tests to cover all current functional optimizers.

ghstack-source-id: 135207143

Test Plan: CI

Reviewed By: SciPioneer

Differential Revision: D29891783

fbshipit-source-id: 321915982afd5cb0a9c2e43d27550f433bff00d1
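The point of `step_param` is that DDP can apply the optimizer update to each parameter as soon as that parameter's gradient has been reduced, overlapping the optimizer with the remainder of the backward pass instead of stepping all parameters afterwards. As a rough sketch of the idea only — the class name, state layout, and signature below are illustrative assumptions, not the actual `_FunctionalAdam` code — a per-parameter Adam step might look like:

```python
import torch

class FunctionalAdamSketch:
    """Illustrative per-parameter Adam; not the real _FunctionalAdam."""

    def __init__(self, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
        self.lr = lr
        self.betas = betas
        self.eps = eps
        self.state = {}  # per-parameter state, keyed by the tensor itself

    @torch.no_grad()
    def step_param(self, param: torch.Tensor, grad: torch.Tensor) -> None:
        # Apply one Adam update to a single parameter as soon as its
        # gradient is ready, rather than stepping all parameters at once.
        beta1, beta2 = self.betas
        state = self.state.setdefault(param, {
            "step": 0,
            "exp_avg": torch.zeros_like(param),
            "exp_avg_sq": torch.zeros_like(param),
        })
        state["step"] += 1
        # Exponential moving averages of the gradient and its square.
        state["exp_avg"].mul_(beta1).add_(grad, alpha=1 - beta1)
        state["exp_avg_sq"].mul_(beta2).addcmul_(grad, grad, value=1 - beta2)
        # Bias-corrected update.
        bias_correction1 = 1 - beta1 ** state["step"]
        bias_correction2 = 1 - beta2 ** state["step"]
        denom = (state["exp_avg_sq"].sqrt() / bias_correction2 ** 0.5).add_(self.eps)
        param.addcdiv_(state["exp_avg"], denom, value=-(self.lr / bias_correction1))

# Hypothetical usage: update one parameter immediately after its gradient
# is computed, with no global step() over the whole parameter list.
p = torch.randn(4, requires_grad=True)
(p * p).sum().backward()
opt = FunctionalAdamSketch()
opt.step_param(p, p.grad)
```

In the DDP scenario the commit targets, a hook on gradient readiness would call `step_param` per parameter as each gradient bucket finishes reducing, which is what makes the overlap possible.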