d8c3ab11 - Fix BC by adding aten::_native_multi_head_self_attention (#72429)

Summary: Forward fixes https://hud2.pytorch.org/minihud?name_filter=linux-xenial-py3.7-gcc5.4%20/%20test%20(backwards_compat,%201,%201,%20linux.2xlarge)

```
The PR is introducing backward incompatible changes to the operator library. Please contact PyTorch team to confirm whether this change is wanted or not.

Broken ops: [
	aten::_native_multi_head_self_attention(Tensor query, Tensor qkv_weight, Tensor qkv_bias, Tensor proj_weight, Tensor proj_bias, Tensor? mask=None) -> (Tensor)
]
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/72429

Reviewed By: albanD

Differential Revision: D34043480

Pulled By: janeyx99

fbshipit-source-id: 7db8c682c7d5c3bd911a87d21670b5bd2f3ad5a1

(cherry picked from commit 0985ebb7f1ac2fc7e6a1e0f387b7771dbe19d31f)
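The "Broken ops" report quoted above comes from a backward-compatibility check that compares the operator schemas of a released build against the current build: any schema present before but missing now is flagged as broken. The sketch below illustrates that diff logic only; the function name `find_broken_ops` and the plain string-set comparison are illustrative assumptions, not PyTorch's actual compatibility test.

```python
# Hypothetical sketch of a schema-based BC check: an op is flagged
# when a schema from the old (released) operator set is absent from
# the new build's set. PyTorch's real check is more involved; this is
# a simplified string-set version for illustration.

def find_broken_ops(old_schemas, new_schemas):
    """Return schemas that existed before but are gone now."""
    return sorted(set(old_schemas) - set(new_schemas))

old = [
    "aten::_native_multi_head_self_attention(Tensor query, Tensor qkv_weight, "
    "Tensor qkv_bias, Tensor proj_weight, Tensor proj_bias, Tensor? mask=None) -> (Tensor)",
    "aten::add.Tensor(Tensor self, Tensor other, *, Scalar alpha=1) -> Tensor",
]

# Before this fix the new build dropped the attention schema, so the
# check reported it as broken:
new_before_fix = old[1:]
print(find_broken_ops(old, new_before_fix))

# After re-adding aten::_native_multi_head_self_attention, the diff
# is empty and the BC job passes:
new_after_fix = list(old)
print(find_broken_ops(old, new_after_fix))  # []
```

Re-adding the old schema (even as an alias of the renamed op) is the usual way to turn such a report green without reverting the rename itself.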