Fix GPT-2 no-past attention fusion for transformers >= 4.27 #27449
xadupre dismissed these changes on 2026-02-26
Rishi-Dave dismissed their stale review via 65bee872 14 days ago
Rishi-Dave force-pushed from 444e9629 to 65bee872 14 days ago
Fix GPT-2 no-past attention fusion for transformers >= 4.27 (2030a97f)
Use verify_fusion in no-past attention test per review feedback (4b881b5a)
Fix line length lint in no-past attention test (604aa892)
Rishi-Dave force-pushed from 65bee872 to 604aa892 11 days ago
tianleiwu approved these changes on 2026-03-05
tianleiwu enabled auto-merge (squash) 10 days ago
tianleiwu merged 01a56ce6 into main 10 days ago