pytorch
fdbed711 - Some fixes to smooth_l1_loss (#45532)

Commit · 4 years ago
Some fixes to smooth_l1_loss (#45532)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/45532

- updated documentation
- explicitly not supporting negative values for beta (previously the result was incorrect)
- removing the default value for beta in the backward function, since it is only used internally by autograd (as per convention)

Test Plan: Imported from OSS

Reviewed By: gchanan

Differential Revision: D24002415

Pulled By: bdhirsh

fbshipit-source-id: 980c141019ec2d437b771ee11fc1cec4b1fcfb48
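For context, a minimal pure-Python sketch of the piecewise smooth L1 loss with a `beta` threshold, including the negative-`beta` rejection this commit introduces. This is an illustrative reference implementation of the well-known formula, not the actual PyTorch kernel; the scalar signature here is a simplification (the real `torch.nn.functional.smooth_l1_loss` operates on tensors).

```python
def smooth_l1_loss(pred, target, beta=1.0):
    """Scalar sketch of smooth L1 loss with a beta threshold.

    Piecewise definition:
        0.5 * d**2 / beta   if |d| < beta   (quadratic near zero)
        |d| - 0.5 * beta    otherwise      (linear in the tails)
    where d = pred - target.
    """
    # As of this commit, negative beta is explicitly rejected rather
    # than silently producing an incorrect result.
    if beta < 0:
        raise ValueError("smooth_l1_loss does not support negative values for beta")
    diff = abs(pred - target)
    if diff < beta:
        # Quadratic region: smooth around zero error.
        return 0.5 * diff * diff / beta
    # Linear region; with beta == 0 this degenerates to plain L1 loss,
    # since the quadratic branch is never taken.
    return diff - 0.5 * beta
```

Note the boundary behavior: at `|diff| == beta` both branches agree (`0.5 * beta`), so the loss and its first derivative are continuous, which is the point of the smooth variant.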