optimum commit 26b5b1ed: Remove FP16_Optimizer patch for DeepSpeed (#2213)

Remove FP16_Optimizer patch for DeepSpeed (#2213)

DeepSpeed already includes the same FusedAdam FP16_Optimizer originally from NVIDIA/apex here:
https://github.com/deepspeedai/DeepSpeed/blob/master/deepspeed/runtime/fp16/fused_optimizer.py

Currently, the patched line only emits a warning:

```
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/onnxruntime/training/optim/_modifier_registry.py:56: UserWarning: Skip modifying optimizer because of optimizer name not found in the registry: accelerate.utils.deepspeed.DeepSpeedOptimizerWrapper
```

In other words, the FP16 Optimizer from onnxruntime (https://github.com/microsoft/onnxruntime/blob/main/orttraining/orttraining/python/training/optim/fp16_optimizer.py) never actually wraps the DeepSpeed FusedAdam optimizer, so the line is redundant.
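For context, a minimal sketch of the kind of patch being removed, assuming an onnxruntime-training install; the helper name, the fp16 flag, and the exact import path are illustrative assumptions, not the exact optimum trainer code:

```python
# Hypothetical sketch of the removed patch (helper name and flag are illustrative).
from onnxruntime.training.optim.fp16_optimizer import FP16_Optimizer


def maybe_patch_optimizer(optimizer, deepspeed_fp16_enabled: bool):
    """Wrap the optimizer with ORT's FP16_Optimizer when DeepSpeed fp16 is enabled."""
    if deepspeed_fp16_enabled:
        # Under accelerate, `optimizer` is a DeepSpeedOptimizerWrapper, whose class
        # name is not in ORT's modifier registry, so this call only logs the
        # "Skip modifying optimizer" UserWarning and returns the optimizer
        # unchanged -- which is why the patch is redundant and can be removed.
        optimizer = FP16_Optimizer(optimizer)
    return optimizer
```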