onnxruntime PR #4268 (Merged)
Keep loss_scale and Whole Loss Subgraph in FP32 during Mixed Precision Training
Commits (9)
- Keep loss subgraph as FP32 when mixed-p training. (Vincent Wang, committed 5 years ago)
- Merge branch 'master' into weicwang/lossscale (Vincent Wang, committed 5 years ago)
- Fix case where there is no white-list loss op. (Vincent Wang, committed 5 years ago)
- Get nodes from loss_scale instead of whitelist. (Vincent Wang, committed 5 years ago)
- rename const variables. (Vincent Wang, committed 5 years ago)
- Merge branch 'master' into weicwang/lossscale (Vincent Wang, committed 5 years ago)
- Merge branch 'master' into weicwang/lossscale (Vincent Wang, committed 5 years ago)
- merge from master (Vincent Wang, committed 5 years ago)
- Merge branch 'master' into weicwang/lossscale (Vincent Wang, committed 5 years ago)