Enable users to use their own loss functions + deal with prefetching for grad accum #34198
57c698fe  bookmark
3c579479  Bookmark
1d57bd8f  Bookmark
15e61f12  Actually implement
928e9271  Pass in kwarg explicitly
b59c8f16  Adjust for if we do or don't have labels
79f9479e  Bookmark fix for od
13f33692  bookmark
8080f28a  Fin
muellerzr marked this pull request as ready for review 1 year ago
13160e08  closer
6fa155a8  Negate accelerate grad accum div
c2a705fd  Fixup not training long enough
ac04e610  Add in compute_loss to take full model output
af8411b8  Document
a5fac5a0  compute_loss -> compute_loss_fn
39d8f28c  Add a test
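The "Negate accelerate grad accum div" and "Pass in kwarg explicitly" commits touch the core numerical issue behind this PR: when micro-batches have unequal item counts, averaging the loss inside each micro-batch and then dividing by the number of accumulation steps is not the same as normalizing once by the total item count. A minimal plain-Python sketch (hypothetical numbers, no framework code) of the two behaviors:

```python
def mean_then_divide(micro_batches, accum_steps):
    # Naive accumulation: average within each micro-batch,
    # then divide each micro-batch loss by the accumulation steps.
    total = 0.0
    for losses in micro_batches:
        total += (sum(losses) / len(losses)) / accum_steps
    return total

def divide_by_total_items(micro_batches):
    # Exact accumulation: sum raw per-item losses across all
    # micro-batches and divide once by the total item count.
    flat = [loss for mb in micro_batches for loss in mb]
    return sum(flat) / len(flat)

# Micro-batches with unequal item counts (e.g. variable sequence
# lengths): 4 items vs. 2 items.
mbs = [[1.0, 1.0, 1.0, 1.0], [2.0, 2.0]]
print(mean_then_divide(mbs, accum_steps=2))  # 1.5
print(divide_by_total_items(mbs))            # 1.333...
```

With equal-sized micro-batches the two agree, which is why the discrepancy only shows up once batch sizes vary; knowing the total item count up front is also why the PR has to deal with dataloader prefetching.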
muellerzr changed the title from "[DRAFT] Enable users to use their own loss functions + deal with prefetching for grad accum" to "Enable users to use their own loss functions + deal with prefetching for grad accum" 1 year ago
42849302  Refactor
932a4910  Refactor
2a6b0383  Uncomment tests
54d10ded  Update tests/trainer/test_trainer.py
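Per the "Add in compute_loss to take full model output" and "compute_loss -> compute_loss_fn" commits, the PR lets users supply their own loss callable that sees the full model output rather than just the loss. A hedged, framework-free sketch of that contract (the names `compute_loss_fn` and `num_items_in_batch` follow the commit messages; the exact shipped signature may differ):

```python
def compute_loss_fn(outputs, labels, num_items_in_batch=None):
    # Toy loss over the model's "logits": sum of squared errors,
    # normalized by the total item count across the accumulated batch
    # when it is known (keeping grad accumulation exact), otherwise by
    # the local batch size.
    errors = [(o - l) ** 2 for o, l in zip(outputs["logits"], labels)]
    denom = num_items_in_batch if num_items_in_batch is not None else len(labels)
    return sum(errors) / denom

# A trainer-style loop would invoke it once per micro-batch, passing
# the item count for the whole accumulated batch:
loss = compute_loss_fn({"logits": [0.5, 1.5]}, [1.0, 1.0], num_items_in_batch=4)
print(loss)  # 0.125
```

Passing `num_items_in_batch` explicitly (rather than letting each micro-batch normalize itself) is what makes the custom loss compose correctly with gradient accumulation.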
muellerzr merged 6ba31a8a into main 1 year ago
muellerzr deleted the muellerzr-fix-loss-calc branch 1 year ago