Reset loss to zero on logging in Trainer to avoid bfloat16 issues #8561
Commits:

- make tr_loss regular float (c9d7ccfa)
- Revert "make tr_loss regular float" (d3e8b825)
- reset loss at each logging step (142d34db)
- keep track of total loss with _total_loss_scalar (1f40edb0)
- add remaining tr_loss at the end (ce16f5c7)
sgugger approved these changes on 2020-11-17.
bminixhofer changed the title from "make tr_loss in Trainer regular float to avoid overflow" to "Reset loss to zero on logging in Trainer to avoid bfloat16 issues".