xla
[benchmarks] Default to `bfloat16` (inference) and AMP (training) precision. #6518
Merged

ysiraichi merged 5 commits into master from ysiraichi/remove-precision-flag
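
With this change the benchmark runner no longer relies on an explicit precision flag; per the title, inference runs default to `bfloat16` and training runs default to AMP. Below is a minimal sketch of that policy in plain PyTorch, assuming hypothetical names (`run_benchmark`, a CUDA device, generic `model`/`inputs`); it is not the PR's actual diff:

```python
import torch

def run_benchmark(model, inputs, training=False):
    # Illustration of the default precision policy described in the PR title;
    # function and argument names are assumptions, not code from this PR.
    if training:
        # Training default: AMP. Parameters stay in float32; autocast picks
        # bfloat16 for eligible ops.
        with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
            loss = model(*inputs).sum()
        loss.backward()
        return loss
    # Inference default: cast the model and floating-point inputs to bfloat16.
    model = model.to(torch.bfloat16).eval()
    inputs = tuple(
        t.to(torch.bfloat16) if t.is_floating_point() else t
        for t in inputs)
    with torch.no_grad():
        return model(*inputs)
```

AMP keeps master weights in float32 during training, while a direct bfloat16 cast suffices for inference, where no gradients are needed.
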
ysiraichi added the xla:gpu label
ysiraichi Remove precision flag assignment. (4864f130)
ysiraichi force-pushed from f83895aa to 4864f130 1 year ago
ysiraichi Fix rebase. (f3e8e5d7)
ysiraichi Do nothing if no conversion dtype. (74c57e9d, sketched below)
ysiraichi marked this pull request as ready for review 1 year ago
ysiraichi Fix test check. (99eea4d9)
ysiraichi Fix renamed method. (f1df97a8)
ysiraichi requested a review from frgossen 1 year ago
ysiraichi removed the review request from frgossen 1 year ago
ysiraichi requested a review from golechwierowicz 1 year ago
ysiraichi requested a review from cota 1 year ago
ysiraichi requested a review from vanbasten23 1 year ago
ysiraichi requested a review from frgossen 1 year ago
ysiraichi requested a review from zpcore 1 year ago
frgossen commented on 2024-02-15
frgossen approved these changes on 2024-02-15
golechwierowicz approved these changes on 2024-02-16
ysiraichi merged 20692cb0 into master 1 year ago

Assignees: no one assigned
Labels: xla:gpu
Milestone: none