Fuse Attention For One Input bert-base-dynamic Model #3850
Commits:
- 558ba71a: match additional mask path for attention
- b8d1586a: comment
- c07e3aae: build test
- a0574680: add test case
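The fusion this PR extends is exposed through ONNX Runtime's transformer optimizer, which matches attention subgraphs (including the mask path) in an exported BERT model and replaces them with a fused Attention node. Below is a minimal sketch of invoking that optimizer on a bert-base model; the file paths are placeholders, and the shape parameters are the standard bert-base values rather than anything taken from this PR.

```python
# Minimal sketch: fuse attention subgraphs in an exported BERT ONNX model
# using onnxruntime's transformer optimizer. The model paths below are
# hypothetical; num_heads/hidden_size are the usual bert-base values.
from onnxruntime.transformers.optimizer import optimize_model

# bert-base: 12 attention heads, hidden size 768.
optimized = optimize_model(
    "bert-base-dynamic.onnx",  # placeholder input model path
    model_type="bert",
    num_heads=12,
    hidden_size=768,
)

# Fused Attention nodes replace the matched subgraphs; the statistics
# report how many operators were fused.
optimized.save_model_to_file("bert-base-dynamic-opt.onnx")
print(optimized.get_fused_operator_statistics())
```

If the attention pattern (for example, the additional mask path handled here) is not matched, the corresponding subgraph is simply left unfused, so the fused-operator counts are a quick way to verify the fusion took effect.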
tianleiwu approved these changes on 2020-05-07.
liuziyue merged 914aaaa1 into master 5 years ago.
liuziyue deleted the ziyl/attention branch 5 years ago.