pytorch: 529ebae0 - Bugfix for TorchScript RNN RELU and TANH (#61274)

Summary:
Fixes https://github.com/pytorch/pytorch/issues/28418. Related: https://github.com/pytorch/pytorch/issues/32976, which had already been fixed previously.

TorchScript handling of GRU and LSTM has been working, but not of RNN (Tanh and ReLU). The reason is that `Union[Tensor, PackedSequence]` is not supported by TorchScript. Using `torch._jit_internal._overload_method` in `RNNBase::forward` does not work, because TorchScript does not pick up the overloads correctly when the method is inherited by `RNN`. My solution is to move `RNNBase::forward` into `RNN` and annotate it with `torch._jit_internal._overload_method`. LSTM and GRU already use their own `forward` methods, so they are unaffected by this fix.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/61274

Reviewed By: anjali411

Differential Revision: D32374452

Pulled By: malfet

fbshipit-source-id: 77bab2469c01c5dfa5eaab229429724a4172445d

Co-authored-by: Nikita Shulga <nshulga@fb.com>
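For context, a minimal sketch of what the fix enables (not taken from the commit; the layer sizes, input shapes, and lengths below are illustrative assumptions): once the overloaded `forward` lives on `RNN` itself, an `nn.RNN` with `nonlinearity='tanh'` or `'relu'` can be scripted and then called with either a plain `Tensor` or a `PackedSequence`, matching the behavior LSTM and GRU already had.

```python
# Illustrative only: sizes and inputs are arbitrary assumptions, not from the
# commit. Requires a PyTorch build that includes this fix.
import torch
from torch import nn
from torch.nn.utils.rnn import pack_padded_sequence

rnn = nn.RNN(input_size=8, hidden_size=16, nonlinearity='relu')
scripted = torch.jit.script(rnn)  # previously failed for RNN (tanh/relu)

# Plain Tensor input: (seq_len, batch, input_size)
x = torch.randn(5, 3, 8)
out, hidden = scripted(x)

# PackedSequence input exercises the second forward overload
lengths = torch.tensor([5, 4, 2])
packed = pack_padded_sequence(torch.randn(5, 3, 8), lengths)
packed_out, hidden = scripted(packed)
```

The overload-stub pattern (one typed `forward` signature per input type plus a single shared implementation) is what lets TorchScript sidestep the unsupported `Union[Tensor, PackedSequence]` annotation.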