onnxruntime
Keep loss_scale and Whole Loss Subgraph in FP32 during Mixed Precision Training
#4268
Merged
Lafi7e merged 9 commits into master from weicwang/lossscale
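Context for the change: mixed precision training runs most of the graph in FP16, but the loss-scaling arithmetic has to stay in FP32, because typical loss-scale factors are large powers of two that overflow half precision (FP16's largest finite value is about 65504). A quick NumPy check makes this concrete; the values below are illustrative, not taken from the PR:

    import numpy as np

    # A typical dynamic loss scale is a large power of two.
    scale = 2.0 ** 16
    print(np.float16(scale))   # inf: 2**16 is not representable in FP16
    print(np.float32(scale))   # 65536.0: fine in FP32

    # Even a modest loss overflows once scaled in FP16 arithmetic:
    loss = 3.5
    print(np.float16(loss) * np.float16(2.0 ** 15))  # inf
    print(np.float32(loss) * np.float32(2.0 ** 15))  # 114688.0

This is why the PR pins loss_scale and the entire loss subgraph to FP32 while the rest of the graph is converted to FP16.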
Commit 24d22a27: Keep loss subgraph as FP32 when mixed-p training.
Commit 9127101c: Merge branch 'master' into weicwang/lossscale
Commit 851220e5: Fix case where there is no white-list loss op.
Lafi7e added the training label
Lafi7e requested a review from SherlockNoMad 5 years ago
Lafi7e requested a review from pengwa 5 years ago
Lafi7e requested a review from souptc 5 years ago
Lafi7e requested a review 5 years ago
weixingzhang commented on 2020-06-18
Commit a21c36c5: Get nodes from loss_scale instead of whitelist. (See the sketch after the timeline.)
Lafi7e requested a review from xzhu1900 5 years ago
xzhu1900 commented on 2020-06-22
Commit 132c488f: rename const variables.
Commit 1f2d4623: Merge branch 'master' into weicwang/lossscale
xzhu1900 dismissed these changes on 2020-06-24
Commit e40ab65e: Merge branch 'master' into weicwang/lossscale
Commit 948480e5: merge from master
Lafi7e dismissed their stale review via 948480e5 5 years ago
Commit 287a7a6f: Merge branch 'master' into weicwang/lossscale
zhijiang-xu approved these changes on 2020-07-02
xzhu1900 approved these changes on 2020-07-02
Lafi7e merged 28e4c0ed into master 5 years ago
Lafi7e deleted the weicwang/lossscale branch 5 years ago
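Commits a21c36c5 and 851220e5 suggest the mechanism: rather than matching loss ops against a fixed type whitelist (which breaks when a model uses a custom loss), the pass derives the FP32 node set from the loss_scale input itself. Below is a minimal sketch of one such traversal using the onnx Python package. It is an illustration under stated assumptions, not onnxruntime's actual transformer: the helper name, the model path, and the input name "loss_scale" are hypothetical, and the real pass may walk producer edges as well as the consumer edges shown here.

    # Sketch only: collect nodes downstream of the loss_scale input so a
    # mixed-precision pass can skip them when casting the graph to FP16.
    import onnx

    def collect_loss_subgraph(graph: onnx.GraphProto, loss_scale_name: str) -> set:
        # Map each tensor name to the indices of the nodes that consume it.
        consumers = {}
        for idx, node in enumerate(graph.node):
            for tensor in node.input:
                consumers.setdefault(tensor, []).append(idx)

        keep_fp32 = set()
        frontier = [loss_scale_name]
        while frontier:
            tensor = frontier.pop()
            for idx in consumers.get(tensor, []):
                if idx not in keep_fp32:
                    keep_fp32.add(idx)
                    # Follow the dataflow forward through this node's outputs.
                    frontier.extend(graph.node[idx].output)
        return keep_fp32

    model = onnx.load("training_graph.onnx")  # hypothetical path
    fp32_node_indices = collect_loss_subgraph(model.graph, "loss_scale")

Starting from a graph input rather than from known op types also covers the case fixed in 851220e5, where none of the loss ops appear on a whitelist.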
Reviewers: xzhu1900, zhijiang-xu, weixingzhang, SherlockNoMad, pengwa, souptc
Assignees: No one assigned
Labels: training
Milestone: No milestone