Add --use_multi_head_attention in transformers fusion #14198
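For context, the flag added by this PR is an option of the `onnxruntime.transformers` optimizer. Below is a minimal usage sketch, assuming `FusionOptions` exposes a `use_multi_head_attention` attribute corresponding to the CLI flag; the model path, `num_heads`, and `hidden_size` values are placeholders, not taken from the PR.

```python
# Minimal sketch (assumed usage): enable MultiHeadAttention fusion when
# optimizing a transformer ONNX model. The CLI equivalent would be roughly:
#   python -m onnxruntime.transformers.optimizer --input model.onnx \
#       --output model_opt.onnx --model_type bert --use_multi_head_attention
from onnxruntime.transformers.fusion_options import FusionOptions
from onnxruntime.transformers.optimizer import optimize_model

options = FusionOptions("bert")
# Assumed attribute matching the new --use_multi_head_attention flag:
options.use_multi_head_attention = True

optimized_model = optimize_model(
    "model.onnx",          # hypothetical input path
    model_type="bert",
    num_heads=12,          # placeholder values for a BERT-base-sized model
    hidden_size=768,
    optimization_options=options,
)
optimized_model.save_model_to_file("model_opt.onnx")
```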
add --use_cross_attention in transformers fusion (64f23c36)
tianleiwu marked this pull request as draft 2 years ago
Merge branch 'main' into tlwu/cross_attention_fusion (da5a244e)
change CrossAttention to MultiHeadAttention (72379cff)
tianleiwu changed the title from "Add --use_cross_attention in transformers fusion" to "Add --use_multi_head_attention in transformers fusion" 2 years ago
tianleiwu marked this pull request as ready for review 2 years ago
tianleiwu marked this pull request as draft 2 years ago
add test case (cb27f6e5)
tianleiwu marked this pull request as ready for review 2 years ago
wangyems approved these changes on 2023-01-11
tianleiwu merged commit 012b34dc into main 2 years ago
tianleiwu deleted the tlwu/cross_attention_fusion branch 2 years ago