pytorch
4e1c0755 - log_sigmoid: Use log1p for improved precision (#66441)

log_sigmoid: Use log1p for improved precision (#66441)

Summary:
Fixes https://github.com/pytorch/pytorch/issues/20972

log_sigmoid calculates something like `log(1 + x)`, where `x` is always a positive number less than one. This wastes floating-point precision because the exponent of `1 + x` is always zero. Using `log1p(x)` instead preserves the full mantissa precision around `x = 0`.

This also fixes infinity propagation: the old code computes `exp(in - in)` when `in` is negative, which for infinity results in a NaN instead of 0.

cc albanD mruberry jbschlosser walterddr

Pull Request resolved: https://github.com/pytorch/pytorch/pull/66441

Reviewed By: bdhirsh

Differential Revision: D31619630

Pulled By: albanD

fbshipit-source-id: e7867f3459a91e944b92f8ca42b6e0697b13f89b
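For illustration, here is a minimal Python sketch of the stable formulation the commit describes. The actual fix lives in PyTorch's C++ ATen kernels; the standalone function below is an assumption-laden illustration, not the shipped implementation:

```python
import math

def log_sigmoid(x: float) -> float:
    """Numerically stable log(sigmoid(x)) = -log(1 + exp(-x)).

    Rewritten as min(x, 0) - log1p(exp(-|x|)) so that:
      * exp never receives a positive argument (no overflow), and
      * log1p keeps full mantissa precision when exp(-|x|) is near 0.
    (Illustrative sketch of the technique in PR #66441, not ATen code.)
    """
    z = math.exp(-abs(x))              # z lies in (0, 1], never overflows
    return min(x, 0.0) - math.log1p(z)

# Infinities now propagate instead of producing NaN:
print(log_sigmoid(float("inf")))   # 0.0
print(log_sigmoid(float("-inf")))  # -inf
print(log_sigmoid(0.0))            # -log(2) = -0.6931471805599453
```

The key point of the rewrite is that `exp(-|x|)` always lands in (0, 1], which is exactly the regime where `log1p` beats `log(1 + x)` in precision, and where the argument to `exp` is never the ill-defined `in - in` that produced NaN for infinite inputs.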