T5Attention support for cross-attention #2654
fix AttnProcessor2_0
15321c77
added scale_qk and out_bias flags
70e7c4e2
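The `scale_qk` and `out_bias` flags are what let the attention module mimic T5Attention, which applies no 1/√d scaling to the query and uses a bias-free output projection. A minimal sketch of how such flags could plausibly be wired into a diffusers-style `Attention` module (illustrative only, not the exact diff):

```python
import torch.nn as nn

class Attention(nn.Module):
    # Sketch following the shape of diffusers' Attention/CrossAttention module;
    # names and defaults here are assumptions, not the PR's code.
    def __init__(self, query_dim, cross_attention_dim=None, heads=8, dim_head=64,
                 scale_qk=True, out_bias=True):
        super().__init__()
        inner_dim = dim_head * heads
        cross_attention_dim = cross_attention_dim or query_dim
        # T5-style attention uses no 1/sqrt(d) scaling and no output-projection bias,
        # so both behaviours are made configurable.
        self.heads = heads
        self.scale = dim_head ** -0.5 if scale_qk else 1.0
        self.to_q = nn.Linear(query_dim, inner_dim, bias=False)
        self.to_k = nn.Linear(cross_attention_dim, inner_dim, bias=False)
        self.to_v = nn.Linear(cross_attention_dim, inner_dim, bias=False)
        self.to_out = nn.ModuleList([nn.Linear(inner_dim, query_dim, bias=out_bias),
                                     nn.Dropout(0.0)])
```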
fixed for xformers
0863561b
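For the xformers path, the same configurable scale has to reach the memory-efficient kernel. A hedged sketch of the call, assuming tensors are already in the (batch × heads, seq_len, head_dim) layout that diffusers' xformers processor uses:

```python
import xformers.ops

def run_xformers_attention(query, key, value, attention_mask=None, scale=None, op=None):
    # Hedged sketch, not the PR's code: passing an explicit scale (e.g. 1.0)
    # overrides the default 1/sqrt(head_dim) softmax scaling, which is what the
    # scale_qk=False (T5-style) case needs.
    return xformers.ops.memory_efficient_attention(
        query, key, value, attn_bias=attention_mask, op=op, scale=scale
    )
```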
check if it has scale argument
8cdaf3ff
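Older custom processors may not accept a `scale` keyword, so the check presumably inspects the processor's signature before forwarding the argument. A hypothetical helper illustrating the idea (names are assumptions, not the PR's code):

```python
import inspect

def call_attn_processor(processor, attn, hidden_states, scale=1.0, **kwargs):
    # Hypothetical helper: forward `scale` only when the processor's __call__
    # actually declares it, so processors written before the flag keep working.
    if "scale" in inspect.signature(processor.__call__).parameters:
        return processor(attn, hidden_states, scale=scale, **kwargs)
    return processor(attn, hidden_states, **kwargs)
```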
Update cross_attention.py
840af0af
check torch version
19861b91
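`AttnProcessor2_0` relies on PyTorch 2.0's fused attention kernel, so the version check amounts to verifying that `torch.nn.functional.scaled_dot_product_attention` exists. A minimal sketch of such a guard:

```python
import torch.nn.functional as F

class AttnProcessor2_0:
    # Sketch of the guard: fail early if the installed torch predates 2.0
    # and therefore lacks the fused scaled_dot_product_attention kernel.
    def __init__(self):
        if not hasattr(F, "scaled_dot_product_attention"):
            raise ImportError(
                "AttnProcessor2_0 requires PyTorch 2.0 or newer for "
                "torch.nn.functional.scaled_dot_product_attention."
            )
```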
fix sliced attn
afc92c4e
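Sliced attention computes the attention matrix chunk by chunk over the (batch × heads) dimension so the full score matrix is never materialised at once; with the new flags the per-slice matmul has to use the configurable scale instead of a hard-coded 1/√d. A rough, self-contained sketch of the technique (not the diffusers implementation):

```python
import torch

def sliced_attention(query, key, value, slice_size, scale=1.0):
    # query/key/value: (batch * heads, seq_len, head_dim). Each slice of the
    # first dimension is attended to independently to bound peak memory.
    batch_size_attention = query.shape[0]
    out = torch.empty_like(query)
    for start in range(0, batch_size_attention, slice_size):
        end = start + slice_size
        scores = torch.baddbmm(
            torch.empty(query[start:end].shape[0], query.shape[1], key.shape[1],
                        device=query.device, dtype=query.dtype),
            query[start:end], key[start:end].transpose(-1, -2),
            beta=0, alpha=scale,  # alpha carries the configurable scale
        )
        probs = scores.softmax(dim=-1)
        out[start:end] = torch.bmm(probs, value[start:end])
    return out
```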
Merge branch 'main' into fix-AttnProcessor2_0
20e29db9
style
d044d2bf
kashif changed the title Fix AttnProcessor2_0 → T5Attention support for cross-attention 2 years ago
set scale
836bc8ac
fix test
9631fa24
fixed addedKV processor
6e77ced7
Merge branch 'main' into fix-AttnProcessor2_0
2bf29986
Merge branch 'main' into fix-AttnProcessor2_0
e0c8955f
revert back AttnProcessor2_0
0a96374f
if missing if
afaa1124
fix inner_dim
9e34860d
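The `inner_dim` fix matters because for T5-style cross-attention `heads * dim_head` need not equal the incoming `hidden_states` width, so the head size should be derived from a projected tensor rather than the raw input. An assumed illustration of the idea, not the actual diff:

```python
def split_heads(attn, key):
    # Assumed illustration: inner_dim is read from the projected tensor (here the key),
    # not from hidden_states.shape[-1], since the two can differ for T5-style attention.
    inner_dim = key.shape[-1]
    head_dim = inner_dim // attn.heads
    return key.view(key.shape[0], -1, attn.heads, head_dim).transpose(1, 2)
```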
Merge branch 'main' into fix-AttnProcessor2_0
f75697f9
kashif deleted the fix-AttnProcessor2_0 branch 2 years ago