pytorch
57eda692 - [fx2trt] fix elementwise op converter with one operand being a literal and has different type (#65004)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/65004

If we have code like `torch.add(x, 1)` where `x` is a float tensor, conversion currently falls apart: we add a constant layer with int32 dtype for `1`, but we actually need float dtype. This diff adds an argument to `get_trt_tensor` that specifies the dtype of the constant layer to create. It also starts adding docstrings to functions.

Reviewed By: yinghai

Differential Revision: D30852156

fbshipit-source-id: 650ce72d2794093a4616e640ea503dcc1c6b2bc4
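The dtype mismatch the diff fixes can be sketched outside TensorRT with NumPy. This is a minimal illustration, not the actual fx2trt code: `get_constant` and `elementwise_add` are hypothetical stand-ins for `get_trt_tensor` and the elementwise converter, and the explicit `dtype` parameter mirrors the argument this diff adds.

```python
import numpy as np

def get_constant(value, dtype=None):
    # Hypothetical analogue of get_trt_tensor's new dtype argument:
    # without an explicit dtype, np.array(1) defaults to an integer
    # dtype -- the counterpart of the int32 constant-layer bug.
    return np.array(value, dtype=dtype)

def elementwise_add(tensor, other):
    # Promote a Python literal operand to the tensor operand's dtype
    # before building the "constant layer", so both operands match.
    if not isinstance(other, np.ndarray):
        other = get_constant(other, dtype=tensor.dtype)
    return tensor + other

x = np.ones(3, dtype=np.float32)
y = elementwise_add(x, 1)   # literal 1 is created as float32, not int32
```

With the fix, the literal takes the dtype of the tensor operand, so the result stays `float32` instead of being promoted through a mixed int/float operation.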