xla
PR #6389 (Merged): Force fp16 models to fp32 if `XLA_USE_FP16` is already set.

ysiraichi merged 4 commits into master from ysiraichi/fix-precision-fp16
Commits by ysiraichi:
- Refactor `default_precision_flat` function. (f342fa27)
- Force model precision to be FP32. (964e91db)
- Force precision only for XLA:CUDA. (87924c85)
ysiraichi added the xla:gpu label.
ysiraichi requested reviews from golechwierowicz, frgossen, zpcore, and vanbasten23 (1 year ago).
ysiraichi changed the title from "Fallback to fp32 if `XLA_USE_FP16` is already set." to "Force to fp16 models to fp32 if `XLA_USE_FP16` is already set." (1 year ago).
ysiraichi pushed another commit: Add comment. (33ac439b)
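The commits above describe the PR's behavior change: when the `XLA_USE_FP16` environment variable is set, PyTorch/XLA itself downcasts fp32 values to fp16, so the harness should keep (or force) the model in fp32 rather than cast it to fp16 itself, to avoid a double conversion. The sketch below is a hypothetical illustration of that idea, not the PR's actual code; the function name `maybe_force_fp32`, the `"1"` truthiness check, and the omission of the XLA:CUDA-only device check (which the third commit adds) are all assumptions.

```python
import os

import torch


def maybe_force_fp32(model: torch.nn.Module) -> torch.nn.Module:
    # Hypothetical sketch of the PR's idea. If XLA_USE_FP16 is set,
    # XLA performs the fp32 -> fp16 downcast itself, so the model's
    # parameters are forced to fp32 here instead of being cast to
    # fp16 by the benchmark harness (which would downcast twice).
    # Assumption: the flag is considered enabled only when set to "1".
    if os.environ.get("XLA_USE_FP16", "0") == "1":
        model = model.to(torch.float32)
    return model
```

Note that per the third commit, the real change applies this forcing only on XLA:CUDA devices; a faithful version would gate the cast on the active device type as well.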
golechwierowicz approved these changes on 2024-01-29
ysiraichi merged 2a17ac20 into master 1 year ago.
